[ 380.313335] nova-conductor[52134]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code. [ 381.536748] nova-conductor[52134]: DEBUG oslo_db.sqlalchemy.engines [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=52134) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 381.562916] nova-conductor[52134]: DEBUG nova.context [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe(cell1) {{(pid=52134) load_cells /opt/stack/nova/nova/context.py:464}} [ 381.564758] nova-conductor[52134]: DEBUG oslo_concurrency.lockutils [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52134) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 381.565098] nova-conductor[52134]: DEBUG oslo_concurrency.lockutils [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52134) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 381.565747] nova-conductor[52134]: DEBUG oslo_concurrency.lockutils [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52134) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 381.566124] nova-conductor[52134]: DEBUG oslo_concurrency.lockutils [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52134) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 381.566319] nova-conductor[52134]: DEBUG oslo_concurrency.lockutils [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52134) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 381.567279] nova-conductor[52134]: DEBUG oslo_concurrency.lockutils [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52134) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 381.572511] nova-conductor[52134]: DEBUG oslo_db.sqlalchemy.engines [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=52134) _check_effective_sql_mode 
/usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 381.572879] nova-conductor[52134]: DEBUG oslo_db.sqlalchemy.engines [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=52134) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 381.631585] nova-conductor[52134]: DEBUG oslo_concurrency.lockutils [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] Acquiring lock "singleton_lock" {{(pid=52134) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 381.631799] nova-conductor[52134]: DEBUG oslo_concurrency.lockutils [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] Acquired lock "singleton_lock" {{(pid=52134) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 381.631993] nova-conductor[52134]: DEBUG oslo_concurrency.lockutils [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] Releasing lock "singleton_lock" {{(pid=52134) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 381.632430] nova-conductor[52134]: INFO oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] Starting 2 workers [ 381.636946] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] Started child 52553 {{(pid=52134) _start_child /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:575}} [ 381.640105] nova-conductor[52553]: INFO nova.service [-] Starting conductor node (version 0.0.1) [ 381.641502] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] Started child 52554 {{(pid=52134) _start_child /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:575}} [ 381.642349] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] Full set of CONF: {{(pid=52134) wait /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:649}} [ 381.642584] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] ******************************************************************************** {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2589}} [ 381.642997] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] Configuration options gathered from: {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2590}} [ 381.643225] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] command line args: ['--config-file', '/etc/nova/nova.conf'] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2591}} [ 381.643606] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] config files: ['/etc/nova/nova.conf'] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2592}} [ 381.643762] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] ================================================================================ {{(pid=52134) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2594}} [ 381.644773] nova-conductor[52554]: INFO nova.service [-] Starting conductor node (version 0.0.1) [ 381.644981] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] allow_resize_to_same_host = True {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.645168] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] arq_binding_timeout = 300 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.645359] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] block_device_allocate_retries = 60 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.645590] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] block_device_allocate_retries_interval = 3 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.645763] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cert = self.pem {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.645953] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] compute_driver = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.646196] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] compute_monitors = [] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.646412] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] config_dir = [] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.646604] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] config_drive_format = iso9660 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.646751] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] config_file = ['/etc/nova/nova.conf'] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.646959] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] config_source = [] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.647158] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] console_host = devstack {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.647347] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] control_exchange = nova {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.647551] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cpu_allocation_ratio = None {{(pid=52134) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.647730] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] daemon = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.647929] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] debug = True {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.648113] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] default_access_ip_network_name = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.648285] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] default_availability_zone = nova {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.648440] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] default_ephemeral_format = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.648734] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.648930] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] default_schedule_zone = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.649113] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] disk_allocation_ratio = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.649290] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] enable_new_services = True {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.649584] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] enabled_apis = ['osapi_compute'] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.649769] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] enabled_ssl_apis = [] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.650012] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] flat_injected = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.650196] 
nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] force_config_drive = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.650372] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] force_raw_images = True {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.650597] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] graceful_shutdown_timeout = 5 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.650775] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] heal_instance_info_cache_interval = 60 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.651212] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] host = devstack {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.651404] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] initial_cpu_allocation_ratio = 4.0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.651577] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] initial_disk_allocation_ratio = 1.0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.651739] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] initial_ram_allocation_ratio = 1.0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.651975] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.652153] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] instance_build_timeout = 0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.652342] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] instance_delete_interval = 300 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.652483] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] instance_format = [instance: %(uuid)s] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.652644] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] instance_name_template = instance-%08x {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.652829] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] instance_usage_audit = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.653029] nova-conductor[52134]: DEBUG 
oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] instance_usage_audit_period = month {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.653212] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.653610] nova-conductor[52553]: DEBUG oslo_db.sqlalchemy.engines [None req-63bc29ab-a643-45c4-9a89-d7b6ff5a2a1d None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=52553) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 381.653769] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] instances_path = /opt/stack/data/nova/instances {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.653967] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] internal_service_availability_zone = internal {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.654142] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] key = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.654301] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] live_migration_retry_count = 30 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.654488] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] log_config_append = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.654658] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.654819] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] log_dir = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.654997] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] log_file = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.655135] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] log_options = True {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.655291] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] log_rotate_interval = 1 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.655492] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] log_rotate_interval_type = days {{(pid=52134) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.655739] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] log_rotation_type = none {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.655994] nova-conductor[52554]: DEBUG oslo_db.sqlalchemy.engines [None req-35b5c8a3-a576-4d13-9356-2b87b213fcc7 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=52554) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 381.656159] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.656306] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.656491] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.656676] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.656802] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.656997] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] long_rpc_timeout = 1800 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.657174] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] max_concurrent_builds = 10 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.657329] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] max_concurrent_live_migrations = 1 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.657480] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] max_concurrent_snapshots = 5 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.657662] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] max_local_block_devices = 3 {{(pid=52134) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.657805] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] max_logfile_count = 30 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.657978] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] max_logfile_size_mb = 200 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.658165] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] maximum_instance_delete_attempts = 5 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.658352] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] metadata_listen = 0.0.0.0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.658561] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] metadata_listen_port = 8775 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.658745] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] metadata_workers = 2 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.658904] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] migrate_max_retries = -1 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.659088] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] mkisofs_cmd = genisoimage {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.659306] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] my_block_storage_ip = 10.180.1.21 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.659490] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] my_ip = 10.180.1.21 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.659661] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] network_allocate_retries = 0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.659842] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.659999] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] osapi_compute_listen = 0.0.0.0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.660176] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] osapi_compute_listen_port = 8774 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.660340] 
nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] osapi_compute_unique_server_name_scope = {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.660519] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] osapi_compute_workers = 2 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.660698] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] password_length = 12 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.660860] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] periodic_enable = True {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.661023] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] periodic_fuzzy_delay = 60 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.661209] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] pointer_model = usbtablet {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.661401] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] preallocate_images = none {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.661569] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] publish_errors = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.661695] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] pybasedir = /opt/stack/nova {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.661896] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] ram_allocation_ratio = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.662079] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] rate_limit_burst = 0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.662249] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] rate_limit_except_level = CRITICAL {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.662546] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] rate_limit_interval = 0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.662623] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] reboot_timeout = 0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.662739] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] reclaim_instance_interval = 0 
{{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.662893] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] record = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.663080] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] reimage_timeout_per_gb = 20 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.663259] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] report_interval = 10 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.663418] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] rescue_timeout = 0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.663573] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] reserved_host_cpus = 0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.663725] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] reserved_host_disk_mb = 0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.663874] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] reserved_host_memory_mb = 512 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.664044] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] reserved_huge_pages = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.664201] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] resize_confirm_window = 0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.664362] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] resize_fs_using_block_device = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.664513] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] resume_guests_state_on_host_boot = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.664685] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.664849] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] rpc_response_timeout = 60 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.665008] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] run_external_periodic_tasks = True {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.665200] 
nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] running_deleted_instance_action = reap {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.665357] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] running_deleted_instance_poll_interval = 1800 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.665509] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] running_deleted_instance_timeout = 0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.665663] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] scheduler_instance_sync_interval = 120 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.665885] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] service_down_time = 60 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.666021] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] servicegroup_driver = db {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.666181] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] shelved_offload_time = 0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.666335] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] shelved_poll_interval = 3600 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.666497] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] shutdown_timeout = 0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.666658] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] source_is_ipv6 = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.666813] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] ssl_only = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.666972] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] state_path = /opt/stack/data/nova {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.667145] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] sync_power_state_interval = 600 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.667338] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] sync_power_state_pool_size = 1000 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.667465] nova-conductor[52134]: DEBUG oslo_service.service [None 
req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] syslog_log_facility = LOG_USER {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.667618] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] tempdir = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.667804] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] timeout_nbd = 10 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.667968] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] transport_url = **** {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.668138] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] update_resources_interval = 0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.668314] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] use_cow_images = True {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.668472] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] use_eventlog = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.668644] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] use_journal = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.668797] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] use_json = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.668957] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] use_rootwrap_daemon = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.669133] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] use_stderr = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.669287] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] use_syslog = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.669459] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vcpu_pin_set = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.669664] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vif_plugging_is_fatal = True {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.669835] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vif_plugging_timeout = 300 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.670056] 
nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] virt_mkfs = [] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.670225] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] volume_usage_poll_interval = 0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.670388] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] watch_log_file = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.670597] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] web = /usr/share/spice-html5 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 381.670873] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_concurrency.disable_process_locking = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.671081] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_concurrency.lock_path = /opt/stack/data/nova {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.671292] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.671457] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.671626] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_messaging_metrics.metrics_process_name = {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.671826] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.672016] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.672252] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] api.auth_strategy = keystone {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.672451] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] api.compute_link_prefix = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.672641] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 
2008-02-01 2008-09-01 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.672810] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] api.dhcp_domain = novalocal {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.672972] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] api.enable_instance_password = True {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.673143] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] api.glance_link_prefix = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.673308] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.673492] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] api.instance_list_cells_batch_strategy = distributed {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.673655] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] api.instance_list_per_project_cells = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.673817] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] api.list_records_by_skipping_down_cells = True {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.674066] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] api.local_metadata_per_cell = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.674295] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] api.max_limit = 1000 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.674449] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] api.metadata_cache_expiration = 15 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.674632] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] api.neutron_default_tenant_id = default {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.674804] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] api.use_forwarded_for = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.674966] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] api.use_neutron_default_nets = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.675143] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] 
api.vendordata_dynamic_connect_timeout = 5 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.675337] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] api.vendordata_dynamic_failure_fatal = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.675507] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.675677] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] api.vendordata_dynamic_ssl_certfile = {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.675860] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] api.vendordata_dynamic_targets = [] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.676028] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] api.vendordata_jsonfile_path = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.676212] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] api.vendordata_providers = ['StaticJSON'] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.676426] nova-conductor[52553]: DEBUG nova.service [None req-63bc29ab-a643-45c4-9a89-d7b6ff5a2a1d None None] Creating RPC server for service conductor {{(pid=52553) start /opt/stack/nova/nova/service.py:182}} [ 381.676556] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cache.backend = dogpile.cache.memcached {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.676737] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cache.backend_argument = **** {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.676930] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cache.config_prefix = cache.oslo {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.677146] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cache.dead_timeout = 60.0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.677311] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cache.debug_cache_backend = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.677472] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cache.enable_retry_client = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.677632] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cache.enable_socket_keepalive = False 
{{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.677796] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cache.enabled = True {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.677980] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cache.expiration_time = 600 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.678150] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cache.hashclient_retry_attempts = 2 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.678312] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cache.hashclient_retry_delay = 1.0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.678472] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cache.memcache_dead_retry = 300 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.678639] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cache.memcache_password = {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.678800] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.678957] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.679128] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cache.memcache_pool_maxsize = 10 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.679300] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cache.memcache_pool_unused_timeout = 60 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.679491] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cache.memcache_sasl_enabled = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.679700] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cache.memcache_servers = ['localhost:11211'] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.679869] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cache.memcache_socket_timeout = 1.0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.680056] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cache.memcache_username = {{(pid=52134) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.680278] nova-conductor[52554]: DEBUG nova.service [None req-35b5c8a3-a576-4d13-9356-2b87b213fcc7 None None] Creating RPC server for service conductor {{(pid=52554) start /opt/stack/nova/nova/service.py:182}} [ 381.680421] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cache.proxies = [] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.680622] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cache.retry_attempts = 2 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.680796] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cache.retry_delay = 0.0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.680962] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cache.socket_keepalive_count = 1 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.681137] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cache.socket_keepalive_idle = 1 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.681297] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cache.socket_keepalive_interval = 1 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.681452] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cache.tls_allowed_ciphers = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.681606] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cache.tls_cafile = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.681776] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cache.tls_certfile = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.681955] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cache.tls_enabled = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.682127] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cache.tls_keyfile = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.682335] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cinder.auth_section = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.682545] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cinder.auth_type = password {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.682713] nova-conductor[52134]: DEBUG oslo_service.service [None 
req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cinder.cafile = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.682907] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cinder.catalog_info = volumev3::publicURL {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.683079] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cinder.certfile = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.683260] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cinder.collect_timing = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.683436] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cinder.cross_az_attach = True {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.683595] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cinder.debug = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.683749] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cinder.endpoint_template = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.683925] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cinder.http_retries = 3 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.684093] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cinder.insecure = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.684250] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cinder.keyfile = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.684420] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cinder.os_region_name = RegionOne {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.684580] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cinder.split_loggers = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.684733] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cinder.timeout = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.684898] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.685065] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] compute.cpu_dedicated_set = None {{(pid=52134) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.685221] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] compute.cpu_shared_set = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.685379] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] compute.image_type_exclude_list = [] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.685534] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] compute.live_migration_wait_for_vif_plug = True {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.685694] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] compute.max_concurrent_disk_ops = 0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.685844] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] compute.max_disk_devices_to_attach = -1 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.686059] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.686288] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.686392] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] compute.resource_provider_association_refresh = 300 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.686552] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] compute.shutdown_retry_interval = 10 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.686733] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.686908] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] conductor.workers = 2 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.687160] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] console.allowed_origins = [] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.687326] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] console.ssl_ciphers = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.687493] nova-conductor[52134]: DEBUG oslo_service.service [None 
req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] console.ssl_minimum_version = default {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.687666] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] consoleauth.token_ttl = 600 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.687864] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cyborg.cafile = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.688094] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cyborg.certfile = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.688191] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cyborg.collect_timing = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.688350] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cyborg.connect_retries = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.688528] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cyborg.connect_retry_delay = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.688718] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cyborg.endpoint_override = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.688882] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cyborg.insecure = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.689046] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cyborg.keyfile = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.689205] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cyborg.max_version = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.689356] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cyborg.min_version = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.689532] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cyborg.region_name = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.689697] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cyborg.service_name = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.689890] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cyborg.service_type = accelerator {{(pid=52134) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.690069] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cyborg.split_loggers = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.690229] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cyborg.status_code_retries = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.690402] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cyborg.status_code_retry_delay = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.691037] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cyborg.timeout = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.691037] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.691037] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] cyborg.version = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.691163] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] database.backend = sqlalchemy {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.691285] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] database.connection = **** {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.691449] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] database.connection_debug = 0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.691609] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] database.connection_parameters = {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.691881] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] database.connection_recycle_time = 3600 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.691965] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] database.connection_trace = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.692207] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] database.db_inc_retry_interval = True {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.692331] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] database.db_max_retries = 20 {{(pid=52134) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.692448] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] database.db_max_retry_interval = 10 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.692596] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] database.db_retry_interval = 1 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.692756] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] database.max_overflow = 50 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.693224] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] database.max_pool_size = 5 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.693224] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] database.max_retries = 10 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.693224] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] database.mysql_enable_ndb = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.693384] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] database.mysql_sql_mode = TRADITIONAL {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.693516] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] database.mysql_wsrep_sync_wait = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.693665] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] database.pool_timeout = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.694019] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] database.retry_interval = 10 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.694019] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] database.slave_connection = **** {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.694159] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] database.sqlite_synchronous = True {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.694408] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] database.use_db_reconnect = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.694491] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] api_database.backend = sqlalchemy {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} 
[ 381.694887] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] api_database.connection = **** {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.694887] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] api_database.connection_debug = 0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.695070] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] api_database.connection_parameters = {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.695140] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] api_database.connection_recycle_time = 3600 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.695315] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] api_database.connection_trace = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.695483] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] api_database.db_inc_retry_interval = True {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.695643] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] api_database.db_max_retries = 20 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.695801] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] api_database.db_max_retry_interval = 10 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.695960] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] api_database.db_retry_interval = 1 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.696306] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] api_database.max_overflow = 50 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.696306] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] api_database.max_pool_size = 5 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.696507] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] api_database.max_retries = 10 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.696635] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] api_database.mysql_enable_ndb = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.696788] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 
381.696914] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] api_database.mysql_wsrep_sync_wait = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.697085] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] api_database.pool_timeout = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.697242] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] api_database.retry_interval = 10 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.697377] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] api_database.slave_connection = **** {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.697536] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] api_database.sqlite_synchronous = True {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.697704] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] devices.enabled_mdev_types = [] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.697975] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.698078] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] ephemeral_storage_encryption.enabled = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.698302] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] ephemeral_storage_encryption.key_size = 512 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.698471] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] glance.api_servers = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.698636] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] glance.cafile = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.698817] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] glance.certfile = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.699030] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] glance.collect_timing = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.699316] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] glance.connect_retries = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.699316] 
nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] glance.connect_retry_delay = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.699481] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] glance.debug = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.699763] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] glance.default_trusted_certificate_ids = [] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.699813] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] glance.enable_certificate_validation = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.699943] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] glance.enable_rbd_download = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.700149] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] glance.endpoint_override = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.700321] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] glance.insecure = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.700479] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] glance.keyfile = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.700636] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] glance.max_version = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.700799] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] glance.min_version = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.701024] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] glance.num_retries = 3 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.701083] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] glance.rbd_ceph_conf = {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.701232] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] glance.rbd_connect_timeout = 5 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.701400] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] glance.rbd_pool = {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.701597] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] 
glance.rbd_user = {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.701724] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] glance.region_name = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.701911] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] glance.service_name = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.702142] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] glance.service_type = image {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.702250] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] glance.split_loggers = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.702652] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] glance.status_code_retries = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.702652] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] glance.status_code_retry_delay = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.702762] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] glance.timeout = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.703121] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.703121] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] glance.verify_glance_signatures = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.703222] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] glance.version = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.703335] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] guestfs.debug = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.703552] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] hyperv.config_drive_cdrom = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.703696] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] hyperv.config_drive_inject_password = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.703843] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=52134) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.703979] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] hyperv.enable_instance_metrics_collection = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.704170] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] hyperv.enable_remotefx = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.704327] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] hyperv.instances_path_share = {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.704528] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] hyperv.iscsi_initiator_list = [] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.704719] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] hyperv.limit_cpu_features = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.704801] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.705155] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.705212] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] hyperv.power_state_check_timeframe = 60 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.705334] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] hyperv.power_state_event_polling_interval = 2 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.705512] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.706016] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] hyperv.use_multipath_io = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.706016] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] hyperv.volume_attach_retry_count = 10 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.706016] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] hyperv.volume_attach_retry_interval = 5 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.706151] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] hyperv.vswitch_name = None {{(pid=52134) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.706271] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.706431] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] mks.enabled = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.706933] nova-conductor[52554]: DEBUG nova.service [None req-35b5c8a3-a576-4d13-9356-2b87b213fcc7 None None] Join ServiceGroup membership for this service conductor {{(pid=52554) start /opt/stack/nova/nova/service.py:199}} [ 381.707166] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.707426] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] image_cache.manager_interval = 2400 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.707517] nova-conductor[52554]: DEBUG nova.servicegroup.drivers.db [None req-35b5c8a3-a576-4d13-9356-2b87b213fcc7 None None] DB_Driver: join new ServiceGroup member devstack to the conductor group, service = {{(pid=52554) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 381.707995] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] image_cache.precache_concurrency = 1 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.707995] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] image_cache.remove_unused_base_images = True {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.707995] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.708183] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.708406] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] image_cache.subdirectory_name = _base {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.708579] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] ironic.api_max_retries = 60 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.708802] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] ironic.api_retry_interval = 2 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.708993] nova-conductor[52553]: DEBUG nova.service [None req-63bc29ab-a643-45c4-9a89-d7b6ff5a2a1d None None] Join ServiceGroup 
membership for this service conductor {{(pid=52553) start /opt/stack/nova/nova/service.py:199}} [ 381.709144] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] ironic.auth_section = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.709501] nova-conductor[52553]: DEBUG nova.servicegroup.drivers.db [None req-63bc29ab-a643-45c4-9a89-d7b6ff5a2a1d None None] DB_Driver: join new ServiceGroup member devstack to the conductor group, service = {{(pid=52553) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 381.709731] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] ironic.auth_type = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.710041] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] ironic.cafile = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.710124] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] ironic.certfile = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.710282] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] ironic.collect_timing = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.710462] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] ironic.connect_retries = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.710649] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] ironic.connect_retry_delay = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.710810] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] ironic.endpoint_override = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.710973] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] ironic.insecure = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.711140] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] ironic.keyfile = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.711296] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] ironic.max_version = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.711449] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] ironic.min_version = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.711595] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] ironic.partition_key = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 
381.711755] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] ironic.peer_list = [] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.711913] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] ironic.region_name = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.712084] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] ironic.serial_console_state_timeout = 10 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.712245] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] ironic.service_name = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.712430] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] ironic.service_type = baremetal {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.712588] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] ironic.split_loggers = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.712744] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] ironic.status_code_retries = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.712897] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] ironic.status_code_retry_delay = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.713056] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] ironic.timeout = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.713237] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.713394] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] ironic.version = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.713593] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.713786] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] key_manager.fixed_key = **** {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.713999] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.714194] 
nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] barbican.barbican_api_version = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.714366] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] barbican.barbican_endpoint = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.714561] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] barbican.barbican_endpoint_type = public {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.714730] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] barbican.barbican_region_name = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.714884] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] barbican.cafile = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.715044] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] barbican.certfile = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.715208] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] barbican.collect_timing = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.715362] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] barbican.insecure = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.715515] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] barbican.keyfile = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.715677] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] barbican.number_of_retries = 60 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.715836] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] barbican.retry_delay = 1 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.716029] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] barbican.send_service_user_token = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.716189] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] barbican.split_loggers = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.716341] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] barbican.timeout = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.716496] nova-conductor[52134]: DEBUG oslo_service.service [None 
req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] barbican.verify_ssl = True {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.716648] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] barbican.verify_ssl_path = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.716811] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] barbican_service_user.auth_section = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.716981] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] barbican_service_user.auth_type = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.717149] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] barbican_service_user.cafile = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.717301] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] barbican_service_user.certfile = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.717457] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] barbican_service_user.collect_timing = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.717612] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] barbican_service_user.insecure = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.717764] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] barbican_service_user.keyfile = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.717922] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] barbican_service_user.split_loggers = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.718083] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] barbican_service_user.timeout = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.718269] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vault.approle_role_id = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.718429] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vault.approle_secret_id = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.718584] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vault.cafile = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.718735] nova-conductor[52134]: DEBUG oslo_service.service [None 
req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vault.certfile = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.718890] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vault.collect_timing = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.719053] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vault.insecure = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.719207] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vault.keyfile = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.719392] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vault.kv_mountpoint = secret {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.719597] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vault.kv_version = 2 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.719773] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vault.namespace = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.719928] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vault.root_token_id = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.720117] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vault.split_loggers = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.720274] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vault.ssl_ca_crt_file = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.720434] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vault.timeout = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.720639] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vault.use_ssl = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.720830] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.721030] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] keystone.cafile = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.721189] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] keystone.certfile = None {{(pid=52134) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.721357] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] keystone.collect_timing = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.721519] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] keystone.connect_retries = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.721674] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] keystone.connect_retry_delay = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.721865] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] keystone.endpoint_override = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.722051] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] keystone.insecure = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.722208] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] keystone.keyfile = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.722360] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] keystone.max_version = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.722510] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] keystone.min_version = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.722662] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] keystone.region_name = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.722815] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] keystone.service_name = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.722976] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] keystone.service_type = identity {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.723140] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] keystone.split_loggers = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.723294] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] keystone.status_code_retries = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.723447] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] keystone.status_code_retry_delay = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.723599] 
nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] keystone.timeout = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.723777] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.723933] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] keystone.version = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.724183] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.connection_uri = {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.724367] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.cpu_mode = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.724546] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.cpu_model_extra_flags = [] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.724708] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.cpu_models = [] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.724873] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.cpu_power_governor_high = performance {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.725047] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.cpu_power_governor_low = powersave {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.725211] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.cpu_power_management = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.725398] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.725575] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.device_detach_attempts = 8 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.725775] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.device_detach_timeout = 20 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.725957] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.disk_cachemodes = [] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.726129] nova-conductor[52134]: DEBUG 
oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.disk_prefix = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.726337] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.enabled_perf_events = [] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.726499] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.file_backed_memory = 0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.726656] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.gid_maps = [] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.726827] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.hw_disk_discard = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.726981] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.hw_machine_type = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.727157] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.images_rbd_ceph_conf = {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.727321] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.727483] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.727649] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.images_rbd_glance_store_name = {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.727812] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.images_rbd_pool = rbd {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.727976] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.images_type = default {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.728143] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.images_volume_group = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.728301] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.inject_key = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.728455] nova-conductor[52134]: DEBUG oslo_service.service [None 
req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.inject_partition = -2 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.728611] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.inject_password = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.728841] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.iscsi_iface = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.728945] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.iser_use_multipath = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.729114] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.live_migration_bandwidth = 0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.729273] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.live_migration_completion_timeout = 800 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.729442] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.live_migration_downtime = 500 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.729621] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.live_migration_downtime_delay = 75 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.729784] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.live_migration_downtime_steps = 10 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.729939] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.live_migration_inbound_addr = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.730116] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.live_migration_permit_auto_converge = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.730275] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.live_migration_permit_post_copy = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.730428] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.live_migration_scheme = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.730595] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.live_migration_timeout_action = abort {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.730749] 
nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.live_migration_tunnelled = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.730902] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.live_migration_uri = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.731068] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.live_migration_with_native_tls = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.731228] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.max_queues = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.731384] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.mem_stats_period_seconds = 10 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.731596] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.nfs_mount_options = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.731979] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.nfs_mount_point_base = /opt/stack/data/nova/mnt {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.732182] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.num_aoe_discover_tries = 3 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.732352] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.num_iser_scan_tries = 5 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.732512] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.num_memory_encrypted_guests = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.732672] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.num_nvme_discover_tries = 5 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.732830] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.num_pcie_ports = 0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.732992] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.num_volume_scan_tries = 5 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.733171] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.pmem_namespaces = [] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.733326] 
nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.quobyte_client_cfg = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.733564] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.quobyte_mount_point_base = /opt/stack/data/nova/mnt {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.733731] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.rbd_connect_timeout = 5 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.733924] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.734113] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.734269] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.rbd_secret_uuid = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.734424] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.rbd_user = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.734588] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.realtime_scheduler_priority = 1 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.734758] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.remote_filesystem_transport = ssh {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.734916] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.rescue_image_id = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.735083] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.rescue_kernel_id = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.735245] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.rescue_ramdisk_id = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.735413] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.rng_dev_path = /dev/urandom {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.735572] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.rx_queue_size = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.735742] 
nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.smbfs_mount_options = {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.735956] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.smbfs_mount_point_base = /opt/stack/data/nova/mnt {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.736131] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.snapshot_compression = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.736287] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.snapshot_image_format = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.736522] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.736688] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.sparse_logical_volumes = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.736844] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.swtpm_enabled = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.737065] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.swtpm_group = tss {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.737186] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.swtpm_user = tss {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.737349] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.sysinfo_serial = unique {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.737526] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.tx_queue_size = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.737693] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.uid_maps = [] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.737853] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.use_virtio_for_bridges = True {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.738025] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.virt_type = kvm {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.738195] 
nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.volume_clear = zero {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.738357] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.volume_clear_size = 0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.738518] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.volume_use_multipath = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.738675] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.vzstorage_cache_path = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.738838] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.739021] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.vzstorage_mount_group = qemu {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.739184] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.vzstorage_mount_opts = [] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.739346] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.739599] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/nova/mnt {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.739788] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.vzstorage_mount_user = stack {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.739947] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.740149] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] neutron.auth_section = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.740327] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] neutron.auth_type = password {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.740488] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] neutron.cafile = None {{(pid=52134) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.740642] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] neutron.certfile = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.740798] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] neutron.collect_timing = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.740972] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] neutron.connect_retries = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.741143] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] neutron.connect_retry_delay = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.741308] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] neutron.default_floating_pool = public {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.741464] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] neutron.endpoint_override = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.741624] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] neutron.extension_sync_interval = 600 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.741793] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] neutron.http_retries = 3 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.741960] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] neutron.insecure = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.742121] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] neutron.keyfile = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.742276] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] neutron.max_version = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.742458] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] neutron.metadata_proxy_shared_secret = **** {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.742611] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] neutron.min_version = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.742769] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] neutron.ovs_bridge = br-int {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.742925] 
nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] neutron.physnets = [] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.743112] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] neutron.region_name = RegionOne {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.743290] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] neutron.service_metadata_proxy = True {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.743449] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] neutron.service_name = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.743636] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] neutron.service_type = network {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.743810] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] neutron.split_loggers = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.743966] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] neutron.status_code_retries = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.744140] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] neutron.status_code_retry_delay = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.744298] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] neutron.timeout = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.744473] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.744629] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] neutron.version = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.744797] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] notifications.bdms_in_notifications = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.744975] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] notifications.default_level = INFO {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.745154] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] notifications.notification_format = unversioned {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.745313] nova-conductor[52134]: DEBUG 
oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] notifications.notify_on_state_change = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.745481] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.745650] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] pci.alias = [] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.745812] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] pci.device_spec = [] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.746025] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] pci.report_in_placement = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.746767] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] placement.auth_section = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.746958] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] placement.auth_type = password {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.747181] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] placement.auth_url = http://10.180.1.21/identity {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.747309] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] placement.cafile = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.747488] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] placement.certfile = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.747660] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] placement.collect_timing = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.747818] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] placement.connect_retries = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.747973] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] placement.connect_retry_delay = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.748138] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] placement.default_domain_id = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.748291] nova-conductor[52134]: DEBUG oslo_service.service [None 
req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] placement.default_domain_name = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.748449] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] placement.domain_id = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.748602] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] placement.domain_name = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.748758] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] placement.endpoint_override = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.748912] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] placement.insecure = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.749101] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] placement.keyfile = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.749223] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] placement.max_version = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.749371] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] placement.min_version = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.749564] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] placement.password = **** {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.749721] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] placement.project_domain_id = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.749882] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] placement.project_domain_name = Default {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.750051] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] placement.project_id = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.750220] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] placement.project_name = service {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.750382] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] placement.region_name = RegionOne {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.750539] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] placement.service_name = None 
{{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.750695] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] placement.service_type = placement {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.750846] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] placement.split_loggers = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.750996] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] placement.status_code_retries = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.751172] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] placement.status_code_retry_delay = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.751327] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] placement.system_scope = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.751477] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] placement.timeout = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.751628] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] placement.trust_id = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.751794] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] placement.user_domain_id = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.751965] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] placement.user_domain_name = Default {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.752131] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] placement.user_id = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.752296] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] placement.username = placement {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.752467] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.752623] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] placement.version = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.752817] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] quota.cores = 20 {{(pid=52134) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.752976] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] quota.count_usage_from_placement = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.753160] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.753344] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] quota.injected_file_content_bytes = 10240 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.753504] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] quota.injected_file_path_length = 255 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.753669] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] quota.injected_files = 5 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.753825] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] quota.instances = 10 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.753979] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] quota.key_pairs = 100 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.754157] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] quota.metadata_items = 128 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.754317] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] quota.ram = 51200 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.754471] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] quota.recheck_quota = True {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.754631] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] quota.server_group_members = 10 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.754786] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] quota.server_groups = 10 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.754952] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] rdp.enabled = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.755275] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.755485] 
nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.755694] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.755894] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] scheduler.image_metadata_prefilter = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.756083] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.756277] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] scheduler.max_attempts = 3 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.756442] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] scheduler.max_placement_results = 1000 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.756607] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.756767] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] scheduler.query_placement_for_availability_zone = True {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.756924] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] scheduler.query_placement_for_image_type_support = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.757116] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.757312] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] scheduler.workers = 2 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.757504] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.757677] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] filter_scheduler.aggregate_image_properties_isolation_separator = . 
{{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.757873] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.758094] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.758268] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.758429] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.758587] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.758790] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.758956] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] filter_scheduler.host_subset_size = 1 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.759122] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] filter_scheduler.image_properties_default_architecture = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.759284] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.759465] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] filter_scheduler.isolated_hosts = [] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.759654] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] filter_scheduler.isolated_images = [] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.759815] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] filter_scheduler.max_instances_per_host = 50 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.759972] nova-conductor[52134]: DEBUG oslo_service.service [None 
req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.760143] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] filter_scheduler.pci_in_placement = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.760300] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.760488] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.760642] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.760798] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.760952] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.761119] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.761312] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] filter_scheduler.track_instance_changes = True {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.761487] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.761658] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] metrics.required = True {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.761835] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] metrics.weight_multiplier = 1.0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.762011] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] metrics.weight_of_unavailable = -10000.0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.762223] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] metrics.weight_setting = [] {{(pid=52134) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.762524] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.762697] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] serial_console.enabled = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.762890] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] serial_console.port_range = 10000:20000 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.763071] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.763241] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.763403] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] serial_console.serialproxy_port = 6083 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.763566] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] service_user.auth_section = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.763736] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] service_user.auth_type = password {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.763890] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] service_user.cafile = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.764051] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] service_user.certfile = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.764210] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] service_user.collect_timing = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.764364] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] service_user.insecure = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.764514] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] service_user.keyfile = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.764708] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] service_user.send_service_user_token = True {{(pid=52134) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.764866] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] service_user.split_loggers = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.765029] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] service_user.timeout = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.765216] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] spice.agent_enabled = True {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.765391] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] spice.enabled = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.765709] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.765935] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] spice.html5proxy_host = 0.0.0.0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.766116] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] spice.html5proxy_port = 6082 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.766280] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] spice.image_compression = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.766436] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] spice.jpeg_compression = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.766590] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] spice.playback_compression = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.766754] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] spice.server_listen = 127.0.0.1 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.766917] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.767082] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] spice.streaming_mode = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.767238] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] spice.zlib_compression = None {{(pid=52134) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.767425] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] upgrade_levels.baseapi = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.767613] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] upgrade_levels.cert = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.767784] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] upgrade_levels.compute = auto {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.767940] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] upgrade_levels.conductor = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.768103] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] upgrade_levels.scheduler = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.768265] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vendordata_dynamic_auth.auth_section = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.768428] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vendordata_dynamic_auth.auth_type = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.768581] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vendordata_dynamic_auth.cafile = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.768733] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vendordata_dynamic_auth.certfile = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.768927] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vendordata_dynamic_auth.collect_timing = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.769050] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vendordata_dynamic_auth.insecure = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.769203] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vendordata_dynamic_auth.keyfile = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.769353] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vendordata_dynamic_auth.split_loggers = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.769531] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vendordata_dynamic_auth.timeout = None {{(pid=52134) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.769745] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vmware.api_retry_count = 10 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.769906] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vmware.ca_file = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.770127] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vmware.cache_prefix = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.770294] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vmware.cluster_name = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.770451] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vmware.connection_pool_size = 10 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.770606] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vmware.console_delay_seconds = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.770759] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vmware.datastore_regex = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.770917] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vmware.host_ip = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.771080] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vmware.host_password = **** {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.771245] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vmware.host_port = 443 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.771402] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vmware.host_username = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.771558] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vmware.insecure = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.771723] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vmware.integration_bridge = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.771879] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vmware.maximum_objects = 100 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.772042] nova-conductor[52134]: DEBUG 
oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vmware.pbm_default_policy = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.772202] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vmware.pbm_enabled = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.772629] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vmware.pbm_wsdl_location = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.772629] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vmware.serial_log_dir = /opt/vmware/vspc {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.772743] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vmware.serial_port_proxy_uri = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.772796] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vmware.serial_port_service_uri = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.772956] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vmware.task_poll_interval = 0.5 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.773127] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vmware.use_linked_clone = True {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.773290] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vmware.vnc_keymap = en-us {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.773451] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vmware.vnc_port = 5900 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.773609] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vmware.vnc_port_total = 10000 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.773795] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vnc.auth_schemes = ['none'] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.773964] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vnc.enabled = True {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.774286] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.774518] nova-conductor[52134]: DEBUG oslo_service.service [None 
req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.774654] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vnc.novncproxy_port = 6080 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.774832] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vnc.server_listen = 127.0.0.1 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.774999] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.775172] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vnc.vencrypt_ca_certs = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.775327] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vnc.vencrypt_client_cert = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.775480] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] vnc.vencrypt_client_key = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.775686] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.775842] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] workarounds.disable_deep_image_inspection = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.776020] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] workarounds.disable_fallback_pcpu_query = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.776249] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] workarounds.disable_group_policy_check_upcall = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.776418] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.776580] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] workarounds.disable_rootwrap = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.776737] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] workarounds.enable_numa_live_migration = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.776892] 
nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.777058] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.777225] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] workarounds.handle_virt_lifecycle_events = True {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.777378] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] workarounds.libvirt_disable_apic = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.777562] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] workarounds.never_download_image_if_on_rbd = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.777732] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.777890] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.778062] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.778223] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.778378] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.778530] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.778687] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.778843] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.778999] nova-conductor[52134]: DEBUG oslo_service.service [None 
req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.779195] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.779363] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] wsgi.client_socket_timeout = 900 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.779572] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] wsgi.default_pool_size = 1000 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.779756] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] wsgi.keep_alive = True {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.779924] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] wsgi.max_header_line = 16384 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.780096] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] wsgi.secure_proxy_ssl_header = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.780258] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] wsgi.ssl_ca_file = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.780419] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] wsgi.ssl_cert_file = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.780572] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] wsgi.ssl_key_file = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.780729] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] wsgi.tcp_keepidle = 600 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.780905] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.781082] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] zvm.ca_file = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.781241] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] zvm.cloud_connector_url = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.781458] nova-conductor[52134]: DEBUG 
oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] zvm.image_tmp_path = /opt/stack/data/nova/images {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.781622] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] zvm.reachable_timeout = 300 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.781886] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_policy.enforce_new_defaults = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.782085] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_policy.enforce_scope = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.782286] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_policy.policy_default_rule = default {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.782484] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.782673] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_policy.policy_file = policy.yaml {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.782864] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.783050] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.783215] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.783394] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.783556] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.783748] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.783968] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=52134) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.784187] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] profiler.connection_string = messaging:// {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.784373] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] profiler.enabled = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.784556] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] profiler.es_doc_type = notification {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.784737] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] profiler.es_scroll_size = 10000 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.784900] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] profiler.es_scroll_time = 2m {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.785067] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] profiler.filter_error_trace = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.785232] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] profiler.hmac_keys = SECRET_KEY {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.785396] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] profiler.sentinel_service_name = mymaster {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.785576] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] profiler.socket_timeout = 0.1 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.785743] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] profiler.trace_sqlalchemy = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.785927] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] remote_debug.host = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.786113] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] remote_debug.port = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.786295] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.786457] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=52134) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.786620] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.786772] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.786932] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.787094] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.787257] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.787417] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.787594] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.787766] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.787919] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.788096] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.788269] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.788433] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.788589] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 
381.788780] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.788939] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.789107] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.789269] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.789455] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.789608] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.789771] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.789926] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.790097] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.790261] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.790424] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_messaging_rabbit.ssl = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.790590] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.790753] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.790908] nova-conductor[52134]: DEBUG oslo_service.service [None 
req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.791086] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.791255] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_messaging_rabbit.ssl_version = {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.791466] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.791676] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_messaging_notifications.retry = -1 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.791888] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.792081] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_messaging_notifications.transport_url = **** {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.792290] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_limit.auth_section = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.792456] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_limit.auth_type = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.792613] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_limit.cafile = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.792764] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_limit.certfile = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.792920] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_limit.collect_timing = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.793140] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_limit.connect_retries = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.793295] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_limit.connect_retry_delay = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.793545] nova-conductor[52134]: DEBUG 
oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_limit.endpoint_id = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.793545] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_limit.endpoint_override = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.793669] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_limit.insecure = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.793840] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_limit.keyfile = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.794351] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_limit.max_version = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.794351] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_limit.min_version = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.794351] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_limit.region_name = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.796370] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_limit.service_name = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.796370] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_limit.service_type = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.796370] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_limit.split_loggers = False {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.796370] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_limit.status_code_retries = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.796370] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_limit.status_code_retry_delay = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.796370] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_limit.timeout = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.796370] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_limit.valid_interfaces = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.796543] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 
None None] oslo_limit.version = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.796543] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_reports.file_event_handler = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.796543] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_reports.file_event_handler_interval = 1 {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.796543] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] oslo_reports.log_dir = None {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 381.796543] nova-conductor[52134]: DEBUG oslo_service.service [None req-a7b2a739-f1ec-4e1c-9471-b07e1eb256d7 None None] ******************************************************************************** {{(pid=52134) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2613}} [ 468.494586] nova-conductor[52553]: DEBUG oslo_db.sqlalchemy.engines [None req-09dc326d-9712-4591-af1e-691135a4f6a7 None None] Parent process 52134 forked (52553) with an open database connection, which is being discarded and recreated. {{(pid=52553) checkout /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:434}} [ 512.747083] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Took 0.54 seconds to select destinations for 1 instance(s). 
{{(pid=52553) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 512.770463] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 512.770746] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 512.772459] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.002s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 512.778061] nova-conductor[52553]: DEBUG oslo_db.sqlalchemy.engines [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=52553) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 512.855817] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 512.856079] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 512.856875] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 512.856949] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Acquiring lock 
"e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 512.857225] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 512.857322] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 512.866029] nova-conductor[52553]: DEBUG oslo_db.sqlalchemy.engines [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=52553) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 512.883436] nova-conductor[52553]: DEBUG nova.quota [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Getting quotas for project 96139b5146b84a968b5a8e9c51ada438. Resources: {'cores', 'instances', 'ram'} {{(pid=52553) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 512.886665] nova-conductor[52553]: DEBUG nova.quota [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Getting quotas for user 0c27275e1557427f82256d40fed0934e and project 96139b5146b84a968b5a8e9c51ada438. 
Resources: {'cores', 'instances', 'ram'} {{(pid=52553) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 512.893166] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52553) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 512.893799] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 512.893992] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 512.894441] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 512.900330] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 512.901200] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 512.902145] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 512.902145] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 512.936110] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 512.936369] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 512.939060] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 512.939060] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Acquiring lock "compute-rpcapi-router" {{(pid=52553) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 512.939060] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Acquired lock "compute-rpcapi-router" {{(pid=52553) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 512.939060] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-36dc807c-2fda-4c6e-b840-c9089a0781d9 None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 512.939273] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-36dc807c-2fda-4c6e-b840-c9089a0781d9 None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 512.939273] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-36dc807c-2fda-4c6e-b840-c9089a0781d9 None None] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 512.939273] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-36dc807c-2fda-4c6e-b840-c9089a0781d9 None None] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 512.939273] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-36dc807c-2fda-4c6e-b840-c9089a0781d9 None None] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 512.939273] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-36dc807c-2fda-4c6e-b840-c9089a0781d9 None None] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 512.946677] nova-conductor[52553]: INFO nova.compute.rpcapi [None req-36dc807c-2fda-4c6e-b840-c9089a0781d9 None None] Automatically selected compute RPC version 6.2 from minimum service version 66 [ 512.947728] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-36dc807c-2fda-4c6e-b840-c9089a0781d9 None None] Releasing lock "compute-rpcapi-router" {{(pid=52553) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 514.737823] nova-conductor[52554]: DEBUG oslo_db.sqlalchemy.engines [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Parent process 52134 forked (52554) with an open database connection, which is being discarded and recreated. {{(pid=52554) checkout /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:434}} [ 514.956721] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Took 0.21 seconds to select destinations for 1 instance(s). 
{{(pid=52554) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 514.981690] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 514.982320] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 514.983875] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.002s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 514.989197] nova-conductor[52554]: DEBUG oslo_db.sqlalchemy.engines [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=52554) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 515.048084] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 515.048351] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 515.049029] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 515.049372] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-deb37827-d8fc-4227-915c-1adfea93e03b 
tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 515.049577] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 515.049745] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 515.059434] nova-conductor[52554]: DEBUG oslo_db.sqlalchemy.engines [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=52554) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 515.073370] nova-conductor[52554]: DEBUG nova.quota [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Getting quotas for project a8331e6e26314ee3bd30dd7f6494daf4. Resources: {'cores', 'instances', 'ram'} {{(pid=52554) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 515.076360] nova-conductor[52554]: DEBUG nova.quota [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Getting quotas for user b98ad87c6bf548458598d21c0d163b57 and project a8331e6e26314ee3bd30dd7f6494daf4. 
Resources: {'cores', 'instances', 'ram'} {{(pid=52554) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 515.082984] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52554) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 515.083654] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 515.083786] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 515.083912] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 515.090315] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 515.091163] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 515.091401] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 
tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 515.091572] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 515.120574] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 515.121219] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 515.122029] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 515.122029] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Acquiring lock "compute-rpcapi-router" {{(pid=52554) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 515.122029] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Acquired lock "compute-rpcapi-router" {{(pid=52554) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 515.122425] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-e7fee614-ca46-466e-97b2-5fb65eb48b29 None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 515.122702] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-e7fee614-ca46-466e-97b2-5fb65eb48b29 None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) 
inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 515.122807] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-e7fee614-ca46-466e-97b2-5fb65eb48b29 None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 515.123174] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-e7fee614-ca46-466e-97b2-5fb65eb48b29 None None] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 515.123354] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-e7fee614-ca46-466e-97b2-5fb65eb48b29 None None] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 515.123511] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-e7fee614-ca46-466e-97b2-5fb65eb48b29 None None] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 515.130862] nova-conductor[52554]: INFO nova.compute.rpcapi [None req-e7fee614-ca46-466e-97b2-5fb65eb48b29 None None] Automatically selected compute RPC version 6.2 from minimum service version 66 [ 515.131369] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-e7fee614-ca46-466e-97b2-5fb65eb48b29 None None] Releasing lock "compute-rpcapi-router" {{(pid=52554) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 515.637608] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Took 0.21 seconds to select destinations for 1 instance(s). {{(pid=52554) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 515.648407] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Took 0.25 seconds to select destinations for 1 instance(s). 
{{(pid=52553) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 515.665931] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 515.665931] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 515.666353] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 515.668621] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 515.669143] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 515.669143] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 515.703434] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 515.703723] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 515.703909] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 515.704264] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 515.704475] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 515.704638] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 515.707472] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 515.709864] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 515.709864] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 515.709864] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 
tempest-MigrationsAdminTest-2016093575-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 515.709864] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 515.710357] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 515.714883] nova-conductor[52553]: DEBUG nova.quota [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Getting quotas for project a07d0346e8884cf394bb87ea702ec039. Resources: {'cores', 'instances', 'ram'} {{(pid=52553) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 515.717030] nova-conductor[52553]: DEBUG nova.quota [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Getting quotas for user 9b20a4b99c3041d986483e1c4d1cbe79 and project a07d0346e8884cf394bb87ea702ec039. Resources: {'cores', 'instances', 'ram'} {{(pid=52553) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 515.718351] nova-conductor[52554]: DEBUG nova.quota [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Getting quotas for project 7913858bdbbe4375917c0e1864ee8d2e. Resources: {'cores', 'instances', 'ram'} {{(pid=52554) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 515.720844] nova-conductor[52554]: DEBUG nova.quota [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Getting quotas for user 588d0c5d584544c3be2d880de2c00a37 and project 7913858bdbbe4375917c0e1864ee8d2e. 
Resources: {'cores', 'instances', 'ram'} {{(pid=52554) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 515.723801] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52553) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 515.724451] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 515.727203] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 515.727203] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 515.727877] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: 4e62d785-7c74-4d3a-9446-e690822d5386] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 515.728178] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52554) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 515.728665] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Acquiring lock 
"e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 515.728833] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 515.729629] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 515.730855] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 515.731079] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 515.731127] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 515.732832] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: ebc60b43-dc9e-4f3c-81c7-f65fe50be628] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 515.733669] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 
tempest-MigrationsAdminTest-2016093575-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 515.733669] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 515.733833] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 515.756987] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 515.757276] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 515.760052] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 515.767231] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 515.767231] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 515.767231] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None 
req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 518.046019] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Took 0.19 seconds to select destinations for 1 instance(s). {{(pid=52553) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 518.062034] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 518.062034] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 518.062034] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 518.123123] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 518.124580] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 518.124580] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 518.124580] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 518.124580] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 518.124766] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 518.133945] nova-conductor[52553]: DEBUG nova.quota [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Getting quotas for project 1d9184500ed74ad7bee0d5616a6dc843. Resources: {'cores', 'instances', 'ram'} {{(pid=52553) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 518.135879] nova-conductor[52553]: DEBUG nova.quota [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Getting quotas for user c175fa3bb59a42898f1fe8ea193beb7c and project 1d9184500ed74ad7bee0d5616a6dc843. 
Resources: {'cores', 'instances', 'ram'} {{(pid=52553) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 518.145900] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52553) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 518.146527] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 518.146649] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 518.146802] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 518.151680] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 518.154263] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 518.154263] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 
tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 518.154263] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 518.171645] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 518.171886] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 518.172070] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 519.101907] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Took 0.17 seconds to select destinations for 1 instance(s). 
{{(pid=52554) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 519.119978] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 519.120223] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 519.120421] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 519.156752] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 519.156752] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 519.157025] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 519.157407] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 519.161213] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s 
{{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 519.161474] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.004s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 519.172698] nova-conductor[52554]: DEBUG nova.quota [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Getting quotas for project da83cfec83084d7bbcc161d5d7b287c4. Resources: {'cores', 'instances', 'ram'} {{(pid=52554) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 519.174842] nova-conductor[52554]: DEBUG nova.quota [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Getting quotas for user e5c4dcb488d34dda9a64f69fb9effb75 and project da83cfec83084d7bbcc161d5d7b287c4. Resources: {'cores', 'instances', 'ram'} {{(pid=52554) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 519.180480] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52554) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 519.181051] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 519.182146] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 519.182146] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 519.184510] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 519.185103] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 519.185315] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 519.185673] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 519.199030] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 519.199030] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 519.199030] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 522.611017] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Took 0.20 seconds to select destinations for 1 
instance(s). {{(pid=52554) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 522.628140] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 522.628420] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 522.628597] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 522.678741] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 522.681450] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 522.681450] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 522.681450] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 522.681450] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock 
"e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 522.681909] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 522.692690] nova-conductor[52554]: DEBUG nova.quota [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Getting quotas for project eb70c075cb2e4c44917d5ba6cb849786. Resources: {'cores', 'instances', 'ram'} {{(pid=52554) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 522.698169] nova-conductor[52554]: DEBUG nova.quota [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Getting quotas for user b786da2369eb45ab916b9e137d644dc8 and project eb70c075cb2e4c44917d5ba6cb849786. Resources: {'cores', 'instances', 'ram'} {{(pid=52554) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 522.704123] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52554) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 522.704123] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 522.704250] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 522.704458] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 522.713210] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: 
ce718fc3-6f75-49b9-8543-c953646ce0d9] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 522.714250] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 522.714250] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 522.714382] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 522.732815] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 522.733466] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 522.733466] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 528.860690] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac 
tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Took 0.14 seconds to select destinations for 1 instance(s). {{(pid=52553) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 528.874182] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 528.874758] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 528.875037] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 528.906885] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 528.907170] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 528.907351] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 528.907702] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 528.907888] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None 
req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 528.908123] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 528.917245] nova-conductor[52553]: DEBUG nova.quota [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Getting quotas for project d3499ac5d4a9412e8e0d2db65c79c59c. Resources: {'cores', 'instances', 'ram'} {{(pid=52553) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 528.919146] nova-conductor[52553]: DEBUG nova.quota [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Getting quotas for user 09f4cd92a283451d9c10fe5f370ffa48 and project d3499ac5d4a9412e8e0d2db65c79c59c. Resources: {'cores', 'instances', 'ram'} {{(pid=52553) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 528.924446] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52553) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 528.925021] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 528.925272] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 528.925453] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 528.928423] nova-conductor[52553]: DEBUG nova.conductor.manager [None 
req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 528.929096] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 528.929308] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 528.929478] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 528.944759] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 528.944989] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 528.945178] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 532.221570] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Took 0.20 seconds to select destinations for 1 instance(s). {{(pid=52554) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 532.237474] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 532.238327] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 532.238327] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 532.269933] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 532.270356] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 532.270620] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 532.271249] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 532.271249] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 532.271518] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 532.282174] nova-conductor[52554]: DEBUG nova.quota [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Getting quotas for project d32baf4d3cbd4e7ba0286e667138fcf2. Resources: {'cores', 'instances', 'ram'} {{(pid=52554) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 532.284676] nova-conductor[52554]: DEBUG nova.quota [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Getting quotas for user 4c0903f005d842aba82be2d553655b79 and project d32baf4d3cbd4e7ba0286e667138fcf2. Resources: {'cores', 'instances', 'ram'} {{(pid=52554) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 532.291228] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52554) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 532.292101] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 532.292101] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 532.292331] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 
532.296632] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 532.297323] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 532.298199] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 532.298199] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 532.311108] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 532.311151] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 532.311293] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 533.597858] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Took 0.21 seconds to select destinations for 2 instance(s). {{(pid=52553) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 533.613168] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 533.613168] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 533.613317] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 533.654379] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 533.654703] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 533.654902] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 533.692949] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 533.692949] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 533.692949] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 533.693334] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 533.693334] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 533.693394] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 533.706151] nova-conductor[52553]: DEBUG nova.quota [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Getting quotas for project c70189e619ac48ffaccbeb4f298abbe1. Resources: {'cores', 'instances', 'ram'} {{(pid=52553) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 533.707758] nova-conductor[52553]: DEBUG nova.quota [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Getting quotas for user 5714eed2e79f4c12ace82daf1985577f and project c70189e619ac48ffaccbeb4f298abbe1. 
Resources: {'cores', 'instances', 'ram'} {{(pid=52553) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 533.721845] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52553) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 533.721845] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 533.722084] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 533.722196] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 533.730453] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 533.731208] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 533.734360] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 533.734360] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 533.751580] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 533.751580] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 533.751580] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 533.757972] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52553) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 533.758484] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 533.758698] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 533.759457] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 
tempest-MultipleCreateTestJSON-359342352-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 533.770122] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 533.770913] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 533.771148] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 533.771325] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 533.805288] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 533.805519] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 533.805692] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None 
req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 534.636381] nova-conductor[52553]: ERROR nova.conductor.manager [None req-3d6f2696-c02e-4d99-9225-f569a7878bde tempest-ServerActionsTestOtherB-856597999 tempest-ServerActionsTestOtherB-856597999-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 534.636381] nova-conductor[52553]: Traceback (most recent call last): [ 534.636381] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 534.636381] nova-conductor[52553]: return func(*args, **kwargs) [ 534.636381] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 534.636381] nova-conductor[52553]: selections = self._select_destinations( [ 534.636381] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 534.636381] nova-conductor[52553]: selections = self._schedule( [ 534.636381] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 534.636381] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 534.636381] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 534.636381] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 534.636381] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 534.636381] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 534.636381] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 534.636381] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 534.636381] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 534.636381] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 534.636381] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 534.636381] nova-conductor[52553]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 534.636381] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 534.637138] nova-conductor[52553]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 534.637138] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 534.637138] nova-conductor[52553]: ERROR nova.conductor.manager result = self.transport._send( [ 534.637138] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 534.637138] nova-conductor[52553]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 534.637138] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 534.637138] nova-conductor[52553]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 534.637138] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 534.637138] nova-conductor[52553]: ERROR nova.conductor.manager raise result [ 534.637138] nova-conductor[52553]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 534.637138] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 534.637138] nova-conductor[52553]: ERROR nova.conductor.manager [ 534.637138] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 534.637138] nova-conductor[52553]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 534.637138] nova-conductor[52553]: ERROR nova.conductor.manager [ 534.637138] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 534.637138] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._select_destinations( [ 534.637138] nova-conductor[52553]: ERROR nova.conductor.manager [ 534.637138] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 534.637138] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._schedule( [ 534.637138] nova-conductor[52553]: ERROR nova.conductor.manager [ 534.637682] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 534.637682] nova-conductor[52553]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 534.637682] nova-conductor[52553]: ERROR nova.conductor.manager [ 534.637682] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 534.637682] nova-conductor[52553]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 534.637682] nova-conductor[52553]: ERROR nova.conductor.manager [ 534.637682] nova-conductor[52553]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
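The exception class in the conductor log reads nova.exception_Remote.NoValidHost_Remote rather than nova.exception.NoValidHost because the error is raised in nova-scheduler and travels back over RPC: oslo.messaging serializes the server-side exception and rebuilds it on the calling side as a dynamically generated subclass whose name carries a _Remote suffix, keeping the remote traceback for logging (which is why the scheduler frames are re-printed line by line above). A simplified illustration of that wrapping pattern, not oslo.messaging's actual deserialization code:

class NoValidHost(Exception):
    pass


def rebuild_remote_exception(exc_cls, message, remote_traceback):
    # Create a new type on the fly so isinstance() checks against the original
    # class still work, while the name records that it crossed an RPC boundary.
    remote_cls = type(exc_cls.__name__ + "_Remote", (exc_cls,), {})
    exc = remote_cls(message)
    exc.remote_traceback = remote_traceback
    return exc


err = rebuild_remote_exception(
    NoValidHost,
    "No valid host was found. There are not enough hosts available.",
    remote_traceback=["File .../nova/scheduler/manager.py, line 499, ..."],
)
assert type(err).__name__ == "NoValidHost_Remote"
assert isinstance(err, NoValidHost)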
[ 534.637682] nova-conductor[52553]: ERROR nova.conductor.manager [ 534.637682] nova-conductor[52553]: ERROR nova.conductor.manager [ 534.651599] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-3d6f2696-c02e-4d99-9225-f569a7878bde tempest-ServerActionsTestOtherB-856597999 tempest-ServerActionsTestOtherB-856597999-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 534.651974] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-3d6f2696-c02e-4d99-9225-f569a7878bde tempest-ServerActionsTestOtherB-856597999 tempest-ServerActionsTestOtherB-856597999-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 534.652216] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-3d6f2696-c02e-4d99-9225-f569a7878bde tempest-ServerActionsTestOtherB-856597999 tempest-ServerActionsTestOtherB-856597999-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 534.736649] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-3d6f2696-c02e-4d99-9225-f569a7878bde tempest-ServerActionsTestOtherB-856597999 tempest-ServerActionsTestOtherB-856597999-project-member] [instance: 6829148d-8c80-4cbd-b6db-ed1855d4174e] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 534.737383] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-3d6f2696-c02e-4d99-9225-f569a7878bde tempest-ServerActionsTestOtherB-856597999 tempest-ServerActionsTestOtherB-856597999-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 534.737593] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-3d6f2696-c02e-4d99-9225-f569a7878bde tempest-ServerActionsTestOtherB-856597999 tempest-ServerActionsTestOtherB-856597999-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 534.737759] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-3d6f2696-c02e-4d99-9225-f569a7878bde tempest-ServerActionsTestOtherB-856597999 tempest-ServerActionsTestOtherB-856597999-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 534.745809] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-3d6f2696-c02e-4d99-9225-f569a7878bde tempest-ServerActionsTestOtherB-856597999 tempest-ServerActionsTestOtherB-856597999-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 534.745809] nova-conductor[52553]: Traceback (most recent call last): [ 534.745809] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 534.745809] nova-conductor[52553]: return func(*args, **kwargs) [ 534.745809] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 534.745809] nova-conductor[52553]: selections = self._select_destinations( [ 534.745809] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 534.745809] nova-conductor[52553]: selections = self._schedule( [ 534.745809] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 534.745809] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 534.745809] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 534.745809] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 534.745809] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 534.745809] nova-conductor[52553]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 534.748016] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-3d6f2696-c02e-4d99-9225-f569a7878bde tempest-ServerActionsTestOtherB-856597999 tempest-ServerActionsTestOtherB-856597999-project-member] [instance: 6829148d-8c80-4cbd-b6db-ed1855d4174e] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 536.423357] nova-conductor[52553]: ERROR nova.conductor.manager [None req-f55a4e7d-ad0c-4c5a-b8d9-d8b186d99991 tempest-VolumesAdminNegativeTest-875907274 tempest-VolumesAdminNegativeTest-875907274-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 536.423357] nova-conductor[52553]: Traceback (most recent call last): [ 536.423357] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 536.423357] nova-conductor[52553]: return func(*args, **kwargs) [ 536.423357] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 536.423357] nova-conductor[52553]: selections = self._select_destinations( [ 536.423357] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 536.423357] nova-conductor[52553]: selections = self._schedule( [ 536.423357] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 536.423357] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 536.423357] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 536.423357] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 536.423357] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 536.423357] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 536.423357] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 536.423357] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 536.423357] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 536.423357] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 536.423357] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 536.423357] nova-conductor[52553]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 536.423357] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 536.424228] nova-conductor[52553]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 536.424228] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 536.424228] nova-conductor[52553]: ERROR nova.conductor.manager result = self.transport._send( [ 536.424228] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 536.424228] nova-conductor[52553]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 536.424228] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 536.424228] nova-conductor[52553]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 536.424228] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 536.424228] nova-conductor[52553]: ERROR nova.conductor.manager raise result [ 536.424228] nova-conductor[52553]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 536.424228] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 536.424228] nova-conductor[52553]: ERROR nova.conductor.manager [ 536.424228] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 536.424228] nova-conductor[52553]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 536.424228] nova-conductor[52553]: ERROR nova.conductor.manager [ 536.424228] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 536.424228] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._select_destinations( [ 536.424228] nova-conductor[52553]: ERROR nova.conductor.manager [ 536.424228] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 536.424228] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._schedule( [ 536.424228] nova-conductor[52553]: ERROR nova.conductor.manager [ 536.424777] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 536.424777] nova-conductor[52553]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 536.424777] nova-conductor[52553]: ERROR nova.conductor.manager [ 536.424777] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 536.424777] nova-conductor[52553]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 536.424777] nova-conductor[52553]: ERROR nova.conductor.manager [ 536.424777] nova-conductor[52553]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 536.424777] nova-conductor[52553]: ERROR nova.conductor.manager [ 536.424777] nova-conductor[52553]: ERROR nova.conductor.manager [ 536.440489] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-f55a4e7d-ad0c-4c5a-b8d9-d8b186d99991 tempest-VolumesAdminNegativeTest-875907274 tempest-VolumesAdminNegativeTest-875907274-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 536.440967] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-f55a4e7d-ad0c-4c5a-b8d9-d8b186d99991 tempest-VolumesAdminNegativeTest-875907274 tempest-VolumesAdminNegativeTest-875907274-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 536.441664] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-f55a4e7d-ad0c-4c5a-b8d9-d8b186d99991 tempest-VolumesAdminNegativeTest-875907274 tempest-VolumesAdminNegativeTest-875907274-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 536.503320] nova-conductor[52554]: ERROR nova.conductor.manager [None req-9a159fc1-b7d9-44c4-ac35-26bd31342f37 tempest-ImagesTestJSON-1412812746 tempest-ImagesTestJSON-1412812746-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 536.503320] nova-conductor[52554]: Traceback (most recent call last): [ 536.503320] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 536.503320] nova-conductor[52554]: return func(*args, **kwargs) [ 536.503320] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 536.503320] nova-conductor[52554]: selections = self._select_destinations( [ 536.503320] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 536.503320] nova-conductor[52554]: selections = self._schedule( [ 536.503320] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 536.503320] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 536.503320] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 536.503320] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 536.503320] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
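The oslo_concurrency.lockutils DEBUG lines above (Acquiring / acquired / released around the cell UUIDs) come from nova.context.set_target_cell, which keeps a cached database connection per cell and serializes cache population behind a lock named after the cell. A rough sketch of that pattern is below, assuming oslo.concurrency is installed; CELL_CACHE and make_db_connection are stand-ins, not Nova's real objects.

from oslo_concurrency import lockutils

CELL_CACHE = {}


def make_db_connection(cell_uuid):
    # Placeholder for creating an oslo.db engine/session for the cell database.
    return "connection-for-%s" % cell_uuid


def set_target_cell(context, cell_uuid):
    # The nested function mirrors the
    # "set_target_cell..get_or_set_cached_cell_and_set_connections" name in the
    # lock messages above; synchronizing on the cell UUID means concurrent
    # requests only serialize when touching the same cell's cached connection.
    @lockutils.synchronized(cell_uuid)
    def get_or_set_cached_cell_and_set_connections():
        if cell_uuid not in CELL_CACHE:
            CELL_CACHE[cell_uuid] = make_db_connection(cell_uuid)
        context.db_connection = CELL_CACHE[cell_uuid]

    get_or_set_cached_cell_and_set_connections()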
[ 536.503320] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 536.503320] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 536.503320] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 536.503320] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 536.503320] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 536.503320] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 536.503320] nova-conductor[52554]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 536.503320] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 536.504456] nova-conductor[52554]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 536.504456] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 536.504456] nova-conductor[52554]: ERROR nova.conductor.manager result = self.transport._send( [ 536.504456] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 536.504456] nova-conductor[52554]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 536.504456] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 536.504456] nova-conductor[52554]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 536.504456] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 536.504456] nova-conductor[52554]: ERROR nova.conductor.manager raise result [ 536.504456] nova-conductor[52554]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 536.504456] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 536.504456] nova-conductor[52554]: ERROR nova.conductor.manager [ 536.504456] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 536.504456] nova-conductor[52554]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 536.504456] nova-conductor[52554]: ERROR nova.conductor.manager [ 536.504456] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 536.504456] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._select_destinations( [ 536.504456] nova-conductor[52554]: ERROR nova.conductor.manager [ 536.504456] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 536.504456] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._schedule( [ 536.504456] nova-conductor[52554]: ERROR nova.conductor.manager [ 536.504994] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 536.504994] nova-conductor[52554]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 536.504994] nova-conductor[52554]: ERROR nova.conductor.manager [ 536.504994] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 536.504994] nova-conductor[52554]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 536.504994] nova-conductor[52554]: ERROR nova.conductor.manager [ 536.504994] nova-conductor[52554]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 536.504994] nova-conductor[52554]: ERROR nova.conductor.manager [ 536.504994] nova-conductor[52554]: ERROR nova.conductor.manager [ 536.509733] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-f55a4e7d-ad0c-4c5a-b8d9-d8b186d99991 tempest-VolumesAdminNegativeTest-875907274 tempest-VolumesAdminNegativeTest-875907274-project-member] [instance: 6054720c-ce85-4211-b9db-6fa6b5d6291f] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 536.510714] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-f55a4e7d-ad0c-4c5a-b8d9-d8b186d99991 tempest-VolumesAdminNegativeTest-875907274 tempest-VolumesAdminNegativeTest-875907274-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 536.511163] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-f55a4e7d-ad0c-4c5a-b8d9-d8b186d99991 tempest-VolumesAdminNegativeTest-875907274 tempest-VolumesAdminNegativeTest-875907274-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 536.511388] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-f55a4e7d-ad0c-4c5a-b8d9-d8b186d99991 tempest-VolumesAdminNegativeTest-875907274 tempest-VolumesAdminNegativeTest-875907274-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 536.511741] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-9a159fc1-b7d9-44c4-ac35-26bd31342f37 tempest-ImagesTestJSON-1412812746 tempest-ImagesTestJSON-1412812746-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 536.512030] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-9a159fc1-b7d9-44c4-ac35-26bd31342f37 tempest-ImagesTestJSON-1412812746 tempest-ImagesTestJSON-1412812746-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 536.512187] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-9a159fc1-b7d9-44c4-ac35-26bd31342f37 tempest-ImagesTestJSON-1412812746 tempest-ImagesTestJSON-1412812746-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 536.515330] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-f55a4e7d-ad0c-4c5a-b8d9-d8b186d99991 tempest-VolumesAdminNegativeTest-875907274 tempest-VolumesAdminNegativeTest-875907274-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 536.515330] nova-conductor[52553]: Traceback (most recent call last): [ 536.515330] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 536.515330] nova-conductor[52553]: return func(*args, **kwargs) [ 536.515330] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 536.515330] nova-conductor[52553]: selections = self._select_destinations( [ 536.515330] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 536.515330] nova-conductor[52553]: selections = self._schedule( [ 536.515330] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 536.515330] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 536.515330] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 536.515330] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 536.515330] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 536.515330] nova-conductor[52553]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 536.515709] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-f55a4e7d-ad0c-4c5a-b8d9-d8b186d99991 tempest-VolumesAdminNegativeTest-875907274 tempest-VolumesAdminNegativeTest-875907274-project-member] [instance: 6054720c-ce85-4211-b9db-6fa6b5d6291f] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 536.605458] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-9a159fc1-b7d9-44c4-ac35-26bd31342f37 tempest-ImagesTestJSON-1412812746 tempest-ImagesTestJSON-1412812746-project-member] [instance: 19276da7-bbeb-47f8-9d9e-e267034c2ad6] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 536.606350] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-9a159fc1-b7d9-44c4-ac35-26bd31342f37 tempest-ImagesTestJSON-1412812746 tempest-ImagesTestJSON-1412812746-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 536.606350] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-9a159fc1-b7d9-44c4-ac35-26bd31342f37 tempest-ImagesTestJSON-1412812746 tempest-ImagesTestJSON-1412812746-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 536.606350] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-9a159fc1-b7d9-44c4-ac35-26bd31342f37 tempest-ImagesTestJSON-1412812746 tempest-ImagesTestJSON-1412812746-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 536.611855] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-9a159fc1-b7d9-44c4-ac35-26bd31342f37 tempest-ImagesTestJSON-1412812746 tempest-ImagesTestJSON-1412812746-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. 
[ 536.611855] nova-conductor[52554]: Traceback (most recent call last): [ 536.611855] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 536.611855] nova-conductor[52554]: return func(*args, **kwargs) [ 536.611855] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 536.611855] nova-conductor[52554]: selections = self._select_destinations( [ 536.611855] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 536.611855] nova-conductor[52554]: selections = self._schedule( [ 536.611855] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 536.611855] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 536.611855] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 536.611855] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 536.611855] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 536.611855] nova-conductor[52554]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 536.612914] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-9a159fc1-b7d9-44c4-ac35-26bd31342f37 tempest-ImagesTestJSON-1412812746 tempest-ImagesTestJSON-1412812746-project-member] [instance: 19276da7-bbeb-47f8-9d9e-e267034c2ad6] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 536.766900] nova-conductor[52554]: ERROR nova.conductor.manager [None req-474a01ad-f140-4343-8653-4c902e581fbe tempest-ServerDiagnosticsV248Test-907938389 tempest-ServerDiagnosticsV248Test-907938389-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 536.766900] nova-conductor[52554]: Traceback (most recent call last): [ 536.766900] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 536.766900] nova-conductor[52554]: return func(*args, **kwargs) [ 536.766900] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 536.766900] nova-conductor[52554]: selections = self._select_destinations( [ 536.766900] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 536.766900] nova-conductor[52554]: selections = self._schedule( [ 536.766900] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 536.766900] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 536.766900] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 536.766900] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 536.766900] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
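When the conductor logs "Setting instance to ERROR state", as it does for 19276da7-bbeb-47f8-9d9e-e267034c2ad6 above, the build request has already been accepted by the API, so the NoValidHost failure only becomes visible to the caller through the instance status and its fault record. A sketch of checking that from the client side, assuming openstacksdk with a configured cloud named "devstack"; the cloud name and the server.fault attribute shape are assumptions to verify against your SDK version.

import openstack

conn = openstack.connect(cloud="devstack")

# Instance UUID taken from the conductor warning above.
server = conn.compute.get_server("19276da7-bbeb-47f8-9d9e-e267034c2ad6")

print(server.status)          # expected: ERROR
if server.fault:
    # For NoValidHost the fault message mirrors the scheduler error in this log.
    print(server.fault.get("message"))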
[ 536.766900] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 536.766900] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 536.766900] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 536.766900] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 536.766900] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 536.766900] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 536.766900] nova-conductor[52554]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 536.766900] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 536.768994] nova-conductor[52554]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 536.768994] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 536.768994] nova-conductor[52554]: ERROR nova.conductor.manager result = self.transport._send( [ 536.768994] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 536.768994] nova-conductor[52554]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 536.768994] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 536.768994] nova-conductor[52554]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 536.768994] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 536.768994] nova-conductor[52554]: ERROR nova.conductor.manager raise result [ 536.768994] nova-conductor[52554]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 536.768994] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 536.768994] nova-conductor[52554]: ERROR nova.conductor.manager [ 536.768994] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 536.768994] nova-conductor[52554]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 536.768994] nova-conductor[52554]: ERROR nova.conductor.manager [ 536.768994] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 536.768994] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._select_destinations( [ 536.768994] nova-conductor[52554]: ERROR nova.conductor.manager [ 536.768994] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 536.768994] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._schedule( [ 536.768994] nova-conductor[52554]: ERROR nova.conductor.manager [ 536.770297] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 536.770297] nova-conductor[52554]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 536.770297] nova-conductor[52554]: ERROR nova.conductor.manager [ 536.770297] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 536.770297] nova-conductor[52554]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 536.770297] nova-conductor[52554]: ERROR nova.conductor.manager [ 536.770297] nova-conductor[52554]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 536.770297] nova-conductor[52554]: ERROR nova.conductor.manager [ 536.770297] nova-conductor[52554]: ERROR nova.conductor.manager [ 536.772178] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-474a01ad-f140-4343-8653-4c902e581fbe tempest-ServerDiagnosticsV248Test-907938389 tempest-ServerDiagnosticsV248Test-907938389-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 536.772421] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-474a01ad-f140-4343-8653-4c902e581fbe tempest-ServerDiagnosticsV248Test-907938389 tempest-ServerDiagnosticsV248Test-907938389-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 536.772598] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-474a01ad-f140-4343-8653-4c902e581fbe tempest-ServerDiagnosticsV248Test-907938389 tempest-ServerDiagnosticsV248Test-907938389-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 536.850768] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-474a01ad-f140-4343-8653-4c902e581fbe tempest-ServerDiagnosticsV248Test-907938389 tempest-ServerDiagnosticsV248Test-907938389-project-member] [instance: 226e7ed5-7ec6-4564-b88c-31c7753e9d60] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 536.854925] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-474a01ad-f140-4343-8653-4c902e581fbe tempest-ServerDiagnosticsV248Test-907938389 tempest-ServerDiagnosticsV248Test-907938389-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 536.855164] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-474a01ad-f140-4343-8653-4c902e581fbe tempest-ServerDiagnosticsV248Test-907938389 tempest-ServerDiagnosticsV248Test-907938389-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 536.855351] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-474a01ad-f140-4343-8653-4c902e581fbe tempest-ServerDiagnosticsV248Test-907938389 tempest-ServerDiagnosticsV248Test-907938389-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 536.859238] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-474a01ad-f140-4343-8653-4c902e581fbe tempest-ServerDiagnosticsV248Test-907938389 tempest-ServerDiagnosticsV248Test-907938389-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 536.859238] nova-conductor[52554]: Traceback (most recent call last): [ 536.859238] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 536.859238] nova-conductor[52554]: return func(*args, **kwargs) [ 536.859238] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 536.859238] nova-conductor[52554]: selections = self._select_destinations( [ 536.859238] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 536.859238] nova-conductor[52554]: selections = self._schedule( [ 536.859238] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 536.859238] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 536.859238] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 536.859238] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 536.859238] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 536.859238] nova-conductor[52554]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 536.860672] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-474a01ad-f140-4343-8653-4c902e581fbe tempest-ServerDiagnosticsV248Test-907938389 tempest-ServerDiagnosticsV248Test-907938389-project-member] [instance: 226e7ed5-7ec6-4564-b88c-31c7753e9d60] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 537.587029] nova-conductor[52554]: ERROR nova.conductor.manager [None req-ff7cb8e1-a721-43fd-89b9-860f3db6a652 tempest-ImagesNegativeTestJSON-1710658454 tempest-ImagesNegativeTestJSON-1710658454-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 537.587029] nova-conductor[52554]: Traceback (most recent call last): [ 537.587029] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 537.587029] nova-conductor[52554]: return func(*args, **kwargs) [ 537.587029] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 537.587029] nova-conductor[52554]: selections = self._select_destinations( [ 537.587029] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 537.587029] nova-conductor[52554]: selections = self._schedule( [ 537.587029] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 537.587029] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 537.587029] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 537.587029] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 537.587029] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 537.587029] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 537.587029] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 537.587029] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 537.587029] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 537.587029] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 537.587029] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 537.587029] nova-conductor[52554]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 537.587029] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 537.587784] nova-conductor[52554]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 537.587784] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 537.587784] nova-conductor[52554]: ERROR nova.conductor.manager result = self.transport._send( [ 537.587784] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 537.587784] nova-conductor[52554]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 537.587784] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 537.587784] nova-conductor[52554]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 537.587784] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 537.587784] nova-conductor[52554]: ERROR nova.conductor.manager raise result [ 537.587784] nova-conductor[52554]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 537.587784] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 537.587784] nova-conductor[52554]: ERROR nova.conductor.manager [ 537.587784] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 537.587784] nova-conductor[52554]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 537.587784] nova-conductor[52554]: ERROR nova.conductor.manager [ 537.587784] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 537.587784] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._select_destinations( [ 537.587784] nova-conductor[52554]: ERROR nova.conductor.manager [ 537.587784] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 537.587784] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._schedule( [ 537.587784] nova-conductor[52554]: ERROR nova.conductor.manager [ 537.588314] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 537.588314] nova-conductor[52554]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 537.588314] nova-conductor[52554]: ERROR nova.conductor.manager [ 537.588314] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 537.588314] nova-conductor[52554]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 537.588314] nova-conductor[52554]: ERROR nova.conductor.manager [ 537.588314] nova-conductor[52554]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
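The block_device_mapping DEBUG entries in this log (for example the one for instance 6791f0d3-b7af-4a3f-8d49-8dd401a95a1f just below) all describe the same simple layout: an image-backed root disk on local storage, boot_index 0, deleted with the instance. A boot request that produces such a mapping looks roughly like the following block_device_mapping_v2 payload; the server name and flavor are placeholders, and the image UUID is the one from the log.

boot_request = {
    "server": {
        "name": "example-vm",          # placeholder
        "flavorRef": "<flavor-id>",    # placeholder
        "block_device_mapping_v2": [
            {
                "uuid": "856e89ba-b7a4-4a81-ad9d-2997fe327c0c",  # image_id from the log
                "source_type": "image",
                "destination_type": "local",
                "boot_index": 0,
                "delete_on_termination": True,
            }
        ],
    }
}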
[ 537.588314] nova-conductor[52554]: ERROR nova.conductor.manager [ 537.588314] nova-conductor[52554]: ERROR nova.conductor.manager [ 537.596556] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-ff7cb8e1-a721-43fd-89b9-860f3db6a652 tempest-ImagesNegativeTestJSON-1710658454 tempest-ImagesNegativeTestJSON-1710658454-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 537.597348] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-ff7cb8e1-a721-43fd-89b9-860f3db6a652 tempest-ImagesNegativeTestJSON-1710658454 tempest-ImagesNegativeTestJSON-1710658454-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 537.597348] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-ff7cb8e1-a721-43fd-89b9-860f3db6a652 tempest-ImagesNegativeTestJSON-1710658454 tempest-ImagesNegativeTestJSON-1710658454-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 537.678177] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-ff7cb8e1-a721-43fd-89b9-860f3db6a652 tempest-ImagesNegativeTestJSON-1710658454 tempest-ImagesNegativeTestJSON-1710658454-project-member] [instance: 6791f0d3-b7af-4a3f-8d49-8dd401a95a1f] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 537.679225] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-ff7cb8e1-a721-43fd-89b9-860f3db6a652 tempest-ImagesNegativeTestJSON-1710658454 tempest-ImagesNegativeTestJSON-1710658454-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 537.679225] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-ff7cb8e1-a721-43fd-89b9-860f3db6a652 tempest-ImagesNegativeTestJSON-1710658454 tempest-ImagesNegativeTestJSON-1710658454-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 537.679225] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-ff7cb8e1-a721-43fd-89b9-860f3db6a652 tempest-ImagesNegativeTestJSON-1710658454 tempest-ImagesNegativeTestJSON-1710658454-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 537.691324] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-ff7cb8e1-a721-43fd-89b9-860f3db6a652 tempest-ImagesNegativeTestJSON-1710658454 tempest-ImagesNegativeTestJSON-1710658454-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 537.691324] nova-conductor[52554]: Traceback (most recent call last): [ 537.691324] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 537.691324] nova-conductor[52554]: return func(*args, **kwargs) [ 537.691324] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 537.691324] nova-conductor[52554]: selections = self._select_destinations( [ 537.691324] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 537.691324] nova-conductor[52554]: selections = self._schedule( [ 537.691324] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 537.691324] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 537.691324] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 537.691324] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 537.691324] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 537.691324] nova-conductor[52554]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 537.693503] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-ff7cb8e1-a721-43fd-89b9-860f3db6a652 tempest-ImagesNegativeTestJSON-1710658454 tempest-ImagesNegativeTestJSON-1710658454-project-member] [instance: 6791f0d3-b7af-4a3f-8d49-8dd401a95a1f] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 538.120871] nova-conductor[52554]: ERROR nova.conductor.manager [None req-038328a3-4db2-4867-9f30-b687e5864aaa tempest-FloatingIPsAssociationTestJSON-1605085979 tempest-FloatingIPsAssociationTestJSON-1605085979-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 538.120871] nova-conductor[52554]: Traceback (most recent call last): [ 538.120871] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 538.120871] nova-conductor[52554]: return func(*args, **kwargs) [ 538.120871] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 538.120871] nova-conductor[52554]: selections = self._select_destinations( [ 538.120871] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 538.120871] nova-conductor[52554]: selections = self._schedule( [ 538.120871] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 538.120871] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 538.120871] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 538.120871] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 538.120871] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 538.120871] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 538.120871] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 538.120871] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 538.120871] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 538.120871] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 538.120871] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 538.120871] nova-conductor[52554]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 538.121883] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 538.121883] nova-conductor[52554]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 538.121883] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 538.121883] nova-conductor[52554]: ERROR nova.conductor.manager result = self.transport._send( [ 538.121883] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 538.121883] nova-conductor[52554]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 538.121883] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 538.121883] nova-conductor[52554]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 538.121883] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 538.121883] nova-conductor[52554]: ERROR nova.conductor.manager raise result [ 538.121883] nova-conductor[52554]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 538.121883] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 538.121883] nova-conductor[52554]: ERROR nova.conductor.manager [ 538.121883] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 538.121883] nova-conductor[52554]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 538.121883] nova-conductor[52554]: ERROR nova.conductor.manager [ 538.121883] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 538.121883] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._select_destinations( [ 538.121883] nova-conductor[52554]: ERROR nova.conductor.manager [ 538.122510] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 538.122510] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._schedule( [ 538.122510] nova-conductor[52554]: ERROR nova.conductor.manager [ 538.122510] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 538.122510] nova-conductor[52554]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 538.122510] nova-conductor[52554]: ERROR nova.conductor.manager [ 538.122510] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 538.122510] nova-conductor[52554]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 538.122510] nova-conductor[52554]: ERROR nova.conductor.manager [ 538.122510] nova-conductor[52554]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 538.122510] nova-conductor[52554]: ERROR nova.conductor.manager [ 538.122510] nova-conductor[52554]: ERROR nova.conductor.manager [ 538.126885] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-038328a3-4db2-4867-9f30-b687e5864aaa tempest-FloatingIPsAssociationTestJSON-1605085979 tempest-FloatingIPsAssociationTestJSON-1605085979-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 538.127335] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-038328a3-4db2-4867-9f30-b687e5864aaa tempest-FloatingIPsAssociationTestJSON-1605085979 tempest-FloatingIPsAssociationTestJSON-1605085979-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 538.127522] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-038328a3-4db2-4867-9f30-b687e5864aaa tempest-FloatingIPsAssociationTestJSON-1605085979 tempest-FloatingIPsAssociationTestJSON-1605085979-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 538.194179] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-038328a3-4db2-4867-9f30-b687e5864aaa tempest-FloatingIPsAssociationTestJSON-1605085979 tempest-FloatingIPsAssociationTestJSON-1605085979-project-member] [instance: a0411d73-cf00-4d1e-ba3d-3c148505f015] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 538.195234] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-038328a3-4db2-4867-9f30-b687e5864aaa tempest-FloatingIPsAssociationTestJSON-1605085979 tempest-FloatingIPsAssociationTestJSON-1605085979-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 538.195234] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-038328a3-4db2-4867-9f30-b687e5864aaa tempest-FloatingIPsAssociationTestJSON-1605085979 tempest-FloatingIPsAssociationTestJSON-1605085979-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 538.195486] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-038328a3-4db2-4867-9f30-b687e5864aaa 
tempest-FloatingIPsAssociationTestJSON-1605085979 tempest-FloatingIPsAssociationTestJSON-1605085979-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 538.201447] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-038328a3-4db2-4867-9f30-b687e5864aaa tempest-FloatingIPsAssociationTestJSON-1605085979 tempest-FloatingIPsAssociationTestJSON-1605085979-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 538.201447] nova-conductor[52554]: Traceback (most recent call last): [ 538.201447] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 538.201447] nova-conductor[52554]: return func(*args, **kwargs) [ 538.201447] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 538.201447] nova-conductor[52554]: selections = self._select_destinations( [ 538.201447] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 538.201447] nova-conductor[52554]: selections = self._schedule( [ 538.201447] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 538.201447] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 538.201447] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 538.201447] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 538.201447] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 538.201447] nova-conductor[52554]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 538.202030] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-038328a3-4db2-4867-9f30-b687e5864aaa tempest-FloatingIPsAssociationTestJSON-1605085979 tempest-FloatingIPsAssociationTestJSON-1605085979-project-member] [instance: a0411d73-cf00-4d1e-ba3d-3c148505f015] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 541.634138] nova-conductor[52553]: ERROR nova.conductor.manager [None req-c4e8c592-64b2-491e-a697-f011ad0a0940 tempest-VolumesAssistedSnapshotsTest-1491090689 tempest-VolumesAssistedSnapshotsTest-1491090689-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 541.634138] nova-conductor[52553]: Traceback (most recent call last): [ 541.634138] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 541.634138] nova-conductor[52553]: return func(*args, **kwargs) [ 541.634138] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 541.634138] nova-conductor[52553]: selections = self._select_destinations( [ 541.634138] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 541.634138] nova-conductor[52553]: selections = self._schedule( [ 541.634138] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 541.634138] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 541.634138] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 541.634138] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 541.634138] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 541.634138] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 541.634138] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 541.634138] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 541.634138] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 541.634138] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 541.634138] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 541.634138] nova-conductor[52553]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 541.634138] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 541.634931] nova-conductor[52553]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 541.634931] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 541.634931] nova-conductor[52553]: ERROR nova.conductor.manager result = self.transport._send( [ 541.634931] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 541.634931] nova-conductor[52553]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 541.634931] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 541.634931] nova-conductor[52553]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 541.634931] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 541.634931] nova-conductor[52553]: ERROR nova.conductor.manager raise result [ 541.634931] nova-conductor[52553]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 541.634931] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 541.634931] nova-conductor[52553]: ERROR nova.conductor.manager [ 541.634931] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 541.634931] nova-conductor[52553]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 541.634931] nova-conductor[52553]: ERROR nova.conductor.manager [ 541.634931] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 541.634931] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._select_destinations( [ 541.634931] nova-conductor[52553]: ERROR nova.conductor.manager [ 541.634931] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 541.634931] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._schedule( [ 541.634931] nova-conductor[52553]: ERROR nova.conductor.manager [ 541.635618] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 541.635618] nova-conductor[52553]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 541.635618] nova-conductor[52553]: ERROR nova.conductor.manager [ 541.635618] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 541.635618] nova-conductor[52553]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 541.635618] nova-conductor[52553]: ERROR nova.conductor.manager [ 541.635618] nova-conductor[52553]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 541.635618] nova-conductor[52553]: ERROR nova.conductor.manager [ 541.635618] nova-conductor[52553]: ERROR nova.conductor.manager [ 541.647129] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-c4e8c592-64b2-491e-a697-f011ad0a0940 tempest-VolumesAssistedSnapshotsTest-1491090689 tempest-VolumesAssistedSnapshotsTest-1491090689-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 541.647360] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-c4e8c592-64b2-491e-a697-f011ad0a0940 tempest-VolumesAssistedSnapshotsTest-1491090689 tempest-VolumesAssistedSnapshotsTest-1491090689-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 541.647535] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-c4e8c592-64b2-491e-a697-f011ad0a0940 tempest-VolumesAssistedSnapshotsTest-1491090689 tempest-VolumesAssistedSnapshotsTest-1491090689-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 541.715935] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-c4e8c592-64b2-491e-a697-f011ad0a0940 tempest-VolumesAssistedSnapshotsTest-1491090689 tempest-VolumesAssistedSnapshotsTest-1491090689-project-member] [instance: 414c8db3-1552-48f8-9231-6b39f8d54b37] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 541.717232] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-c4e8c592-64b2-491e-a697-f011ad0a0940 tempest-VolumesAssistedSnapshotsTest-1491090689 tempest-VolumesAssistedSnapshotsTest-1491090689-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 541.717232] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-c4e8c592-64b2-491e-a697-f011ad0a0940 tempest-VolumesAssistedSnapshotsTest-1491090689 tempest-VolumesAssistedSnapshotsTest-1491090689-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 541.717232] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-c4e8c592-64b2-491e-a697-f011ad0a0940 tempest-VolumesAssistedSnapshotsTest-1491090689 
tempest-VolumesAssistedSnapshotsTest-1491090689-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 541.721342] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-c4e8c592-64b2-491e-a697-f011ad0a0940 tempest-VolumesAssistedSnapshotsTest-1491090689 tempest-VolumesAssistedSnapshotsTest-1491090689-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 541.721342] nova-conductor[52553]: Traceback (most recent call last): [ 541.721342] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 541.721342] nova-conductor[52553]: return func(*args, **kwargs) [ 541.721342] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 541.721342] nova-conductor[52553]: selections = self._select_destinations( [ 541.721342] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 541.721342] nova-conductor[52553]: selections = self._schedule( [ 541.721342] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 541.721342] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 541.721342] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 541.721342] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 541.721342] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 541.721342] nova-conductor[52553]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 541.722035] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-c4e8c592-64b2-491e-a697-f011ad0a0940 tempest-VolumesAssistedSnapshotsTest-1491090689 tempest-VolumesAssistedSnapshotsTest-1491090689-project-member] [instance: 414c8db3-1552-48f8-9231-6b39f8d54b37] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 544.133962] nova-conductor[52554]: ERROR nova.conductor.manager [None req-f4be070b-8111-4de1-8d26-83a281f34390 tempest-ServerMetadataNegativeTestJSON-1596432718 tempest-ServerMetadataNegativeTestJSON-1596432718-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 544.133962] nova-conductor[52554]: Traceback (most recent call last): [ 544.133962] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 544.133962] nova-conductor[52554]: return func(*args, **kwargs) [ 544.133962] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 544.133962] nova-conductor[52554]: selections = self._select_destinations( [ 544.133962] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 544.133962] nova-conductor[52554]: selections = self._schedule( [ 544.133962] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 544.133962] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 544.133962] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 544.133962] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 544.133962] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 544.133962] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 544.133962] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 544.133962] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 544.133962] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 544.133962] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 544.133962] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 544.133962] nova-conductor[52554]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 544.135048] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 544.135048] nova-conductor[52554]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 544.135048] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 544.135048] nova-conductor[52554]: ERROR nova.conductor.manager result = self.transport._send( [ 544.135048] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 544.135048] nova-conductor[52554]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 544.135048] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 544.135048] nova-conductor[52554]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 544.135048] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 544.135048] nova-conductor[52554]: ERROR nova.conductor.manager raise result [ 544.135048] nova-conductor[52554]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 544.135048] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 544.135048] nova-conductor[52554]: ERROR nova.conductor.manager [ 544.135048] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 544.135048] nova-conductor[52554]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 544.135048] nova-conductor[52554]: ERROR nova.conductor.manager [ 544.135048] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 544.135048] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._select_destinations( [ 544.135048] nova-conductor[52554]: ERROR nova.conductor.manager [ 544.135563] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 544.135563] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._schedule( [ 544.135563] nova-conductor[52554]: ERROR nova.conductor.manager [ 544.135563] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 544.135563] nova-conductor[52554]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 544.135563] nova-conductor[52554]: ERROR nova.conductor.manager [ 544.135563] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 544.135563] nova-conductor[52554]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 544.135563] nova-conductor[52554]: ERROR nova.conductor.manager [ 544.135563] nova-conductor[52554]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 544.135563] nova-conductor[52554]: ERROR nova.conductor.manager [ 544.135563] nova-conductor[52554]: ERROR nova.conductor.manager [ 544.141609] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-f4be070b-8111-4de1-8d26-83a281f34390 tempest-ServerMetadataNegativeTestJSON-1596432718 tempest-ServerMetadataNegativeTestJSON-1596432718-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 544.141916] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-f4be070b-8111-4de1-8d26-83a281f34390 tempest-ServerMetadataNegativeTestJSON-1596432718 tempest-ServerMetadataNegativeTestJSON-1596432718-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 544.142117] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-f4be070b-8111-4de1-8d26-83a281f34390 tempest-ServerMetadataNegativeTestJSON-1596432718 tempest-ServerMetadataNegativeTestJSON-1596432718-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 544.184404] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-f4be070b-8111-4de1-8d26-83a281f34390 tempest-ServerMetadataNegativeTestJSON-1596432718 tempest-ServerMetadataNegativeTestJSON-1596432718-project-member] [instance: 432c69c7-6f6c-4790-8ef9-d405cc9bf158] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 544.185123] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-f4be070b-8111-4de1-8d26-83a281f34390 tempest-ServerMetadataNegativeTestJSON-1596432718 tempest-ServerMetadataNegativeTestJSON-1596432718-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 544.185339] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-f4be070b-8111-4de1-8d26-83a281f34390 tempest-ServerMetadataNegativeTestJSON-1596432718 tempest-ServerMetadataNegativeTestJSON-1596432718-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 544.185692] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-f4be070b-8111-4de1-8d26-83a281f34390 
tempest-ServerMetadataNegativeTestJSON-1596432718 tempest-ServerMetadataNegativeTestJSON-1596432718-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 544.188445] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-f4be070b-8111-4de1-8d26-83a281f34390 tempest-ServerMetadataNegativeTestJSON-1596432718 tempest-ServerMetadataNegativeTestJSON-1596432718-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 544.188445] nova-conductor[52554]: Traceback (most recent call last): [ 544.188445] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 544.188445] nova-conductor[52554]: return func(*args, **kwargs) [ 544.188445] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 544.188445] nova-conductor[52554]: selections = self._select_destinations( [ 544.188445] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 544.188445] nova-conductor[52554]: selections = self._schedule( [ 544.188445] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 544.188445] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 544.188445] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 544.188445] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 544.188445] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 544.188445] nova-conductor[52554]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 544.189238] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-f4be070b-8111-4de1-8d26-83a281f34390 tempest-ServerMetadataNegativeTestJSON-1596432718 tempest-ServerMetadataNegativeTestJSON-1596432718-project-member] [instance: 432c69c7-6f6c-4790-8ef9-d405cc9bf158] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 544.450540] nova-conductor[52553]: ERROR nova.conductor.manager [None req-c2296365-a38c-42ca-ab1d-660922a68cc3 tempest-ServersWithSpecificFlavorTestJSON-207981500 tempest-ServersWithSpecificFlavorTestJSON-207981500-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 544.450540] nova-conductor[52553]: Traceback (most recent call last): [ 544.450540] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 544.450540] nova-conductor[52553]: return func(*args, **kwargs) [ 544.450540] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 544.450540] nova-conductor[52553]: selections = self._select_destinations( [ 544.450540] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 544.450540] nova-conductor[52553]: selections = self._schedule( [ 544.450540] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 544.450540] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 544.450540] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 544.450540] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 544.450540] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 544.450540] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 544.450540] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 544.450540] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 544.450540] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 544.450540] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 544.450540] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 544.450540] nova-conductor[52553]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 544.451966] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 544.451966] nova-conductor[52553]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 544.451966] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 544.451966] nova-conductor[52553]: ERROR nova.conductor.manager result = self.transport._send( [ 544.451966] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 544.451966] nova-conductor[52553]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 544.451966] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 544.451966] nova-conductor[52553]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 544.451966] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 544.451966] nova-conductor[52553]: ERROR nova.conductor.manager raise result [ 544.451966] nova-conductor[52553]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 544.451966] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 544.451966] nova-conductor[52553]: ERROR nova.conductor.manager [ 544.451966] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 544.451966] nova-conductor[52553]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 544.451966] nova-conductor[52553]: ERROR nova.conductor.manager [ 544.451966] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 544.451966] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._select_destinations( [ 544.451966] nova-conductor[52553]: ERROR nova.conductor.manager [ 544.454175] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 544.454175] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._schedule( [ 544.454175] nova-conductor[52553]: ERROR nova.conductor.manager [ 544.454175] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 544.454175] nova-conductor[52553]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 544.454175] nova-conductor[52553]: ERROR nova.conductor.manager [ 544.454175] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 544.454175] nova-conductor[52553]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 544.454175] nova-conductor[52553]: ERROR nova.conductor.manager [ 544.454175] nova-conductor[52553]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 544.454175] nova-conductor[52553]: ERROR nova.conductor.manager [ 544.454175] nova-conductor[52553]: ERROR nova.conductor.manager [ 544.460915] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-c2296365-a38c-42ca-ab1d-660922a68cc3 tempest-ServersWithSpecificFlavorTestJSON-207981500 tempest-ServersWithSpecificFlavorTestJSON-207981500-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 544.460915] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-c2296365-a38c-42ca-ab1d-660922a68cc3 tempest-ServersWithSpecificFlavorTestJSON-207981500 tempest-ServersWithSpecificFlavorTestJSON-207981500-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 544.461183] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-c2296365-a38c-42ca-ab1d-660922a68cc3 tempest-ServersWithSpecificFlavorTestJSON-207981500 tempest-ServersWithSpecificFlavorTestJSON-207981500-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 544.529023] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-c2296365-a38c-42ca-ab1d-660922a68cc3 tempest-ServersWithSpecificFlavorTestJSON-207981500 tempest-ServersWithSpecificFlavorTestJSON-207981500-project-member] [instance: 395d134b-079c-4191-8fd7-48fd8cec163c] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 544.529023] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-c2296365-a38c-42ca-ab1d-660922a68cc3 tempest-ServersWithSpecificFlavorTestJSON-207981500 tempest-ServersWithSpecificFlavorTestJSON-207981500-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 544.529023] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-c2296365-a38c-42ca-ab1d-660922a68cc3 tempest-ServersWithSpecificFlavorTestJSON-207981500 tempest-ServersWithSpecificFlavorTestJSON-207981500-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 544.529501] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-c2296365-a38c-42ca-ab1d-660922a68cc3 
tempest-ServersWithSpecificFlavorTestJSON-207981500 tempest-ServersWithSpecificFlavorTestJSON-207981500-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 544.537018] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-c2296365-a38c-42ca-ab1d-660922a68cc3 tempest-ServersWithSpecificFlavorTestJSON-207981500 tempest-ServersWithSpecificFlavorTestJSON-207981500-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 544.537018] nova-conductor[52553]: Traceback (most recent call last): [ 544.537018] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 544.537018] nova-conductor[52553]: return func(*args, **kwargs) [ 544.537018] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 544.537018] nova-conductor[52553]: selections = self._select_destinations( [ 544.537018] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 544.537018] nova-conductor[52553]: selections = self._schedule( [ 544.537018] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 544.537018] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 544.537018] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 544.537018] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 544.537018] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 544.537018] nova-conductor[52553]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 544.537018] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-c2296365-a38c-42ca-ab1d-660922a68cc3 tempest-ServersWithSpecificFlavorTestJSON-207981500 tempest-ServersWithSpecificFlavorTestJSON-207981500-project-member] [instance: 395d134b-079c-4191-8fd7-48fd8cec163c] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 547.913282] nova-conductor[52554]: ERROR nova.conductor.manager [None req-927d6fb4-d4f5-469c-9198-e2f5818c01e6 tempest-VolumesAdminNegativeTest-875907274 tempest-VolumesAdminNegativeTest-875907274-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 547.913282] nova-conductor[52554]: Traceback (most recent call last): [ 547.913282] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 547.913282] nova-conductor[52554]: return func(*args, **kwargs) [ 547.913282] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 547.913282] nova-conductor[52554]: selections = self._select_destinations( [ 547.913282] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 547.913282] nova-conductor[52554]: selections = self._schedule( [ 547.913282] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 547.913282] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 547.913282] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 547.913282] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 547.913282] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 547.913282] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 547.913282] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 547.913282] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 547.913282] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 547.913282] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 547.913282] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 547.913282] nova-conductor[52554]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 547.913282] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 547.914058] nova-conductor[52554]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 547.914058] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 547.914058] nova-conductor[52554]: ERROR nova.conductor.manager result = self.transport._send( [ 547.914058] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 547.914058] nova-conductor[52554]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 547.914058] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 547.914058] nova-conductor[52554]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 547.914058] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 547.914058] nova-conductor[52554]: ERROR nova.conductor.manager raise result [ 547.914058] nova-conductor[52554]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 547.914058] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 547.914058] nova-conductor[52554]: ERROR nova.conductor.manager [ 547.914058] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 547.914058] nova-conductor[52554]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 547.914058] nova-conductor[52554]: ERROR nova.conductor.manager [ 547.914058] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 547.914058] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._select_destinations( [ 547.914058] nova-conductor[52554]: ERROR nova.conductor.manager [ 547.914058] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 547.914058] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._schedule( [ 547.914058] nova-conductor[52554]: ERROR nova.conductor.manager [ 547.914641] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 547.914641] nova-conductor[52554]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 547.914641] nova-conductor[52554]: ERROR nova.conductor.manager [ 547.914641] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 547.914641] nova-conductor[52554]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 547.914641] nova-conductor[52554]: ERROR nova.conductor.manager [ 547.914641] nova-conductor[52554]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 547.914641] nova-conductor[52554]: ERROR nova.conductor.manager [ 547.914641] nova-conductor[52554]: ERROR nova.conductor.manager [ 547.925629] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-927d6fb4-d4f5-469c-9198-e2f5818c01e6 tempest-VolumesAdminNegativeTest-875907274 tempest-VolumesAdminNegativeTest-875907274-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 547.927049] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-927d6fb4-d4f5-469c-9198-e2f5818c01e6 tempest-VolumesAdminNegativeTest-875907274 tempest-VolumesAdminNegativeTest-875907274-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 547.927049] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-927d6fb4-d4f5-469c-9198-e2f5818c01e6 tempest-VolumesAdminNegativeTest-875907274 tempest-VolumesAdminNegativeTest-875907274-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 547.976211] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-927d6fb4-d4f5-469c-9198-e2f5818c01e6 tempest-VolumesAdminNegativeTest-875907274 tempest-VolumesAdminNegativeTest-875907274-project-member] [instance: 6145ade8-736b-4ad4-9988-d9a021308ecc] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 547.976364] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-927d6fb4-d4f5-469c-9198-e2f5818c01e6 tempest-VolumesAdminNegativeTest-875907274 tempest-VolumesAdminNegativeTest-875907274-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 547.976511] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-927d6fb4-d4f5-469c-9198-e2f5818c01e6 tempest-VolumesAdminNegativeTest-875907274 tempest-VolumesAdminNegativeTest-875907274-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 547.976681] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-927d6fb4-d4f5-469c-9198-e2f5818c01e6 tempest-VolumesAdminNegativeTest-875907274 tempest-VolumesAdminNegativeTest-875907274-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 547.985256] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-927d6fb4-d4f5-469c-9198-e2f5818c01e6 tempest-VolumesAdminNegativeTest-875907274 tempest-VolumesAdminNegativeTest-875907274-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 547.985256] nova-conductor[52554]: Traceback (most recent call last): [ 547.985256] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 547.985256] nova-conductor[52554]: return func(*args, **kwargs) [ 547.985256] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 547.985256] nova-conductor[52554]: selections = self._select_destinations( [ 547.985256] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 547.985256] nova-conductor[52554]: selections = self._schedule( [ 547.985256] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 547.985256] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 547.985256] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 547.985256] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 547.985256] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 547.985256] nova-conductor[52554]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 547.985825] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-927d6fb4-d4f5-469c-9198-e2f5818c01e6 tempest-VolumesAdminNegativeTest-875907274 tempest-VolumesAdminNegativeTest-875907274-project-member] [instance: 6145ade8-736b-4ad4-9988-d9a021308ecc] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 548.044610] nova-conductor[52553]: ERROR nova.conductor.manager [None req-4e9ac174-8f7d-4a10-b294-b73a62f52148 tempest-ImagesTestJSON-1412812746 tempest-ImagesTestJSON-1412812746-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 548.044610] nova-conductor[52553]: Traceback (most recent call last): [ 548.044610] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 548.044610] nova-conductor[52553]: return func(*args, **kwargs) [ 548.044610] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 548.044610] nova-conductor[52553]: selections = self._select_destinations( [ 548.044610] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 548.044610] nova-conductor[52553]: selections = self._schedule( [ 548.044610] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 548.044610] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 548.044610] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 548.044610] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 548.044610] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 548.044610] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 548.044610] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 548.044610] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 548.044610] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 548.044610] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 548.044610] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 548.044610] nova-conductor[52553]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 548.044610] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 548.046741] nova-conductor[52553]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 548.046741] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 548.046741] nova-conductor[52553]: ERROR nova.conductor.manager result = self.transport._send( [ 548.046741] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 548.046741] nova-conductor[52553]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 548.046741] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 548.046741] nova-conductor[52553]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 548.046741] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 548.046741] nova-conductor[52553]: ERROR nova.conductor.manager raise result [ 548.046741] nova-conductor[52553]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 548.046741] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 548.046741] nova-conductor[52553]: ERROR nova.conductor.manager [ 548.046741] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 548.046741] nova-conductor[52553]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 548.046741] nova-conductor[52553]: ERROR nova.conductor.manager [ 548.046741] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 548.046741] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._select_destinations( [ 548.046741] nova-conductor[52553]: ERROR nova.conductor.manager [ 548.046741] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 548.046741] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._schedule( [ 548.046741] nova-conductor[52553]: ERROR nova.conductor.manager [ 548.047488] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 548.047488] nova-conductor[52553]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 548.047488] nova-conductor[52553]: ERROR nova.conductor.manager [ 548.047488] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 548.047488] nova-conductor[52553]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 548.047488] nova-conductor[52553]: ERROR nova.conductor.manager [ 548.047488] nova-conductor[52553]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 548.047488] nova-conductor[52553]: ERROR nova.conductor.manager [ 548.047488] nova-conductor[52553]: ERROR nova.conductor.manager [ 548.059599] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-4e9ac174-8f7d-4a10-b294-b73a62f52148 tempest-ImagesTestJSON-1412812746 tempest-ImagesTestJSON-1412812746-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 548.059599] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-4e9ac174-8f7d-4a10-b294-b73a62f52148 tempest-ImagesTestJSON-1412812746 tempest-ImagesTestJSON-1412812746-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 548.059599] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-4e9ac174-8f7d-4a10-b294-b73a62f52148 tempest-ImagesTestJSON-1412812746 tempest-ImagesTestJSON-1412812746-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 548.199670] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-4e9ac174-8f7d-4a10-b294-b73a62f52148 tempest-ImagesTestJSON-1412812746 tempest-ImagesTestJSON-1412812746-project-member] [instance: a9a45897-e8de-4d6a-a42f-ce83d7f478ba] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 548.200451] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-4e9ac174-8f7d-4a10-b294-b73a62f52148 tempest-ImagesTestJSON-1412812746 tempest-ImagesTestJSON-1412812746-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 548.200628] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-4e9ac174-8f7d-4a10-b294-b73a62f52148 tempest-ImagesTestJSON-1412812746 tempest-ImagesTestJSON-1412812746-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 548.200831] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-4e9ac174-8f7d-4a10-b294-b73a62f52148 tempest-ImagesTestJSON-1412812746 tempest-ImagesTestJSON-1412812746-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s 
{{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 548.208440] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-4e9ac174-8f7d-4a10-b294-b73a62f52148 tempest-ImagesTestJSON-1412812746 tempest-ImagesTestJSON-1412812746-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 548.208440] nova-conductor[52553]: Traceback (most recent call last): [ 548.208440] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 548.208440] nova-conductor[52553]: return func(*args, **kwargs) [ 548.208440] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 548.208440] nova-conductor[52553]: selections = self._select_destinations( [ 548.208440] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 548.208440] nova-conductor[52553]: selections = self._schedule( [ 548.208440] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 548.208440] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 548.208440] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 548.208440] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 548.208440] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 548.208440] nova-conductor[52553]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 548.208985] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-4e9ac174-8f7d-4a10-b294-b73a62f52148 tempest-ImagesTestJSON-1412812746 tempest-ImagesTestJSON-1412812746-project-member] [instance: a9a45897-e8de-4d6a-a42f-ce83d7f478ba] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 552.375017] nova-conductor[52554]: ERROR nova.conductor.manager [None req-a354b68b-89ab-4643-9198-fccff9e1749d tempest-ServersTestBootFromVolume-1166592086 tempest-ServersTestBootFromVolume-1166592086-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 552.375017] nova-conductor[52554]: Traceback (most recent call last): [ 552.375017] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 552.375017] nova-conductor[52554]: return func(*args, **kwargs) [ 552.375017] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 552.375017] nova-conductor[52554]: selections = self._select_destinations( [ 552.375017] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 552.375017] nova-conductor[52554]: selections = self._schedule( [ 552.375017] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 552.375017] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 552.375017] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 552.375017] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 552.375017] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 552.375017] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 552.375017] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 552.375017] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 552.375017] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 552.375017] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 552.375017] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 552.375017] nova-conductor[52554]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 552.375017] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 552.375911] nova-conductor[52554]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 552.375911] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 552.375911] nova-conductor[52554]: ERROR nova.conductor.manager result = self.transport._send( [ 552.375911] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 552.375911] nova-conductor[52554]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 552.375911] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 552.375911] nova-conductor[52554]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 552.375911] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 552.375911] nova-conductor[52554]: ERROR nova.conductor.manager raise result [ 552.375911] nova-conductor[52554]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 552.375911] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 552.375911] nova-conductor[52554]: ERROR nova.conductor.manager [ 552.375911] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 552.375911] nova-conductor[52554]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 552.375911] nova-conductor[52554]: ERROR nova.conductor.manager [ 552.375911] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 552.375911] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._select_destinations( [ 552.375911] nova-conductor[52554]: ERROR nova.conductor.manager [ 552.375911] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 552.375911] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._schedule( [ 552.375911] nova-conductor[52554]: ERROR nova.conductor.manager [ 552.376467] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 552.376467] nova-conductor[52554]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 552.376467] nova-conductor[52554]: ERROR nova.conductor.manager [ 552.376467] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 552.376467] nova-conductor[52554]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 552.376467] nova-conductor[52554]: ERROR nova.conductor.manager [ 552.376467] nova-conductor[52554]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 552.376467] nova-conductor[52554]: ERROR nova.conductor.manager [ 552.376467] nova-conductor[52554]: ERROR nova.conductor.manager [ 552.382655] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-a354b68b-89ab-4643-9198-fccff9e1749d tempest-ServersTestBootFromVolume-1166592086 tempest-ServersTestBootFromVolume-1166592086-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 552.382960] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-a354b68b-89ab-4643-9198-fccff9e1749d tempest-ServersTestBootFromVolume-1166592086 tempest-ServersTestBootFromVolume-1166592086-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 552.383188] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-a354b68b-89ab-4643-9198-fccff9e1749d tempest-ServersTestBootFromVolume-1166592086 tempest-ServersTestBootFromVolume-1166592086-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 552.445954] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-a354b68b-89ab-4643-9198-fccff9e1749d tempest-ServersTestBootFromVolume-1166592086 tempest-ServersTestBootFromVolume-1166592086-project-member] [instance: 03e064b3-75c7-4eba-8eb9-3d8cc21f7ac8] block_device_mapping [BlockDeviceMapping(attachment_id=d570a4ec-dbb4-4de8-a2da-96a961b10094,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='volume',device_name=None,device_type=None,disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id=None,instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='volume',tag=None,updated_at=,uuid=,volume_id='39034935-3c5c-4d44-9ada-b5921b8766f7',volume_size=1,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 552.445954] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-a354b68b-89ab-4643-9198-fccff9e1749d tempest-ServersTestBootFromVolume-1166592086 tempest-ServersTestBootFromVolume-1166592086-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 552.445954] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-a354b68b-89ab-4643-9198-fccff9e1749d tempest-ServersTestBootFromVolume-1166592086 tempest-ServersTestBootFromVolume-1166592086-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 552.446196] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-a354b68b-89ab-4643-9198-fccff9e1749d tempest-ServersTestBootFromVolume-1166592086 
tempest-ServersTestBootFromVolume-1166592086-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 552.449771] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-a354b68b-89ab-4643-9198-fccff9e1749d tempest-ServersTestBootFromVolume-1166592086 tempest-ServersTestBootFromVolume-1166592086-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 552.449771] nova-conductor[52554]: Traceback (most recent call last): [ 552.449771] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 552.449771] nova-conductor[52554]: return func(*args, **kwargs) [ 552.449771] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 552.449771] nova-conductor[52554]: selections = self._select_destinations( [ 552.449771] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 552.449771] nova-conductor[52554]: selections = self._schedule( [ 552.449771] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 552.449771] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 552.449771] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 552.449771] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 552.449771] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 552.449771] nova-conductor[52554]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 552.449771] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-a354b68b-89ab-4643-9198-fccff9e1749d tempest-ServersTestBootFromVolume-1166592086 tempest-ServersTestBootFromVolume-1166592086-project-member] [instance: 03e064b3-75c7-4eba-8eb9-3d8cc21f7ac8] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 560.666217] nova-conductor[52553]: ERROR nova.conductor.manager [None req-a2defa33-7f5d-46a6-9d91-348f73f4ee54 tempest-ImagesTestJSON-1412812746 tempest-ImagesTestJSON-1412812746-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 560.666217] nova-conductor[52553]: Traceback (most recent call last): [ 560.666217] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 560.666217] nova-conductor[52553]: return func(*args, **kwargs) [ 560.666217] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 560.666217] nova-conductor[52553]: selections = self._select_destinations( [ 560.666217] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 560.666217] nova-conductor[52553]: selections = self._schedule( [ 560.666217] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 560.666217] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 560.666217] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 560.666217] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 560.666217] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 560.666217] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 560.666217] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 560.666217] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 560.666217] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 560.666217] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 560.666217] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 560.666217] nova-conductor[52553]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 560.666217] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 560.667631] nova-conductor[52553]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 560.667631] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 560.667631] nova-conductor[52553]: ERROR nova.conductor.manager result = self.transport._send( [ 560.667631] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 560.667631] nova-conductor[52553]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 560.667631] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 560.667631] nova-conductor[52553]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 560.667631] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 560.667631] nova-conductor[52553]: ERROR nova.conductor.manager raise result [ 560.667631] nova-conductor[52553]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 560.667631] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 560.667631] nova-conductor[52553]: ERROR nova.conductor.manager [ 560.667631] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 560.667631] nova-conductor[52553]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 560.667631] nova-conductor[52553]: ERROR nova.conductor.manager [ 560.667631] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 560.667631] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._select_destinations( [ 560.667631] nova-conductor[52553]: ERROR nova.conductor.manager [ 560.667631] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 560.667631] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._schedule( [ 560.667631] nova-conductor[52553]: ERROR nova.conductor.manager [ 560.668191] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 560.668191] nova-conductor[52553]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 560.668191] nova-conductor[52553]: ERROR nova.conductor.manager [ 560.668191] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 560.668191] nova-conductor[52553]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 560.668191] nova-conductor[52553]: ERROR nova.conductor.manager [ 560.668191] nova-conductor[52553]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 560.668191] nova-conductor[52553]: ERROR nova.conductor.manager [ 560.668191] nova-conductor[52553]: ERROR nova.conductor.manager [ 560.676768] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-a2defa33-7f5d-46a6-9d91-348f73f4ee54 tempest-ImagesTestJSON-1412812746 tempest-ImagesTestJSON-1412812746-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 560.676933] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-a2defa33-7f5d-46a6-9d91-348f73f4ee54 tempest-ImagesTestJSON-1412812746 tempest-ImagesTestJSON-1412812746-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 560.677127] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-a2defa33-7f5d-46a6-9d91-348f73f4ee54 tempest-ImagesTestJSON-1412812746 tempest-ImagesTestJSON-1412812746-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 560.721532] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-a2defa33-7f5d-46a6-9d91-348f73f4ee54 tempest-ImagesTestJSON-1412812746 tempest-ImagesTestJSON-1412812746-project-member] [instance: 49544334-4e4a-47d5-a164-5354b7493b88] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 560.722101] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-a2defa33-7f5d-46a6-9d91-348f73f4ee54 tempest-ImagesTestJSON-1412812746 tempest-ImagesTestJSON-1412812746-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 560.722318] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-a2defa33-7f5d-46a6-9d91-348f73f4ee54 tempest-ImagesTestJSON-1412812746 tempest-ImagesTestJSON-1412812746-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 560.722493] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-a2defa33-7f5d-46a6-9d91-348f73f4ee54 tempest-ImagesTestJSON-1412812746 tempest-ImagesTestJSON-1412812746-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s 
{{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 560.728444] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-a2defa33-7f5d-46a6-9d91-348f73f4ee54 tempest-ImagesTestJSON-1412812746 tempest-ImagesTestJSON-1412812746-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 560.728444] nova-conductor[52553]: Traceback (most recent call last): [ 560.728444] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 560.728444] nova-conductor[52553]: return func(*args, **kwargs) [ 560.728444] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 560.728444] nova-conductor[52553]: selections = self._select_destinations( [ 560.728444] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 560.728444] nova-conductor[52553]: selections = self._schedule( [ 560.728444] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 560.728444] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 560.728444] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 560.728444] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 560.728444] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 560.728444] nova-conductor[52553]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 560.729056] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-a2defa33-7f5d-46a6-9d91-348f73f4ee54 tempest-ImagesTestJSON-1412812746 tempest-ImagesTestJSON-1412812746-project-member] [instance: 49544334-4e4a-47d5-a164-5354b7493b88] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 564.372357] nova-conductor[52554]: ERROR nova.conductor.manager [None req-92d21b6c-f8fa-46b9-b1d9-0d5c34d01aa0 tempest-ListServersNegativeTestJSON-593946779 tempest-ListServersNegativeTestJSON-593946779-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 564.372357] nova-conductor[52554]: Traceback (most recent call last): [ 564.372357] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 564.372357] nova-conductor[52554]: return func(*args, **kwargs) [ 564.372357] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 564.372357] nova-conductor[52554]: selections = self._select_destinations( [ 564.372357] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 564.372357] nova-conductor[52554]: selections = self._schedule( [ 564.372357] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 564.372357] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 564.372357] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 564.372357] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 564.372357] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 564.372357] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 564.372357] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 564.372357] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 564.372357] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 564.372357] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 564.372357] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 564.372357] nova-conductor[52554]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 564.372357] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 564.373567] nova-conductor[52554]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 564.373567] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 564.373567] nova-conductor[52554]: ERROR nova.conductor.manager result = self.transport._send( [ 564.373567] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 564.373567] nova-conductor[52554]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 564.373567] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 564.373567] nova-conductor[52554]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 564.373567] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 564.373567] nova-conductor[52554]: ERROR nova.conductor.manager raise result [ 564.373567] nova-conductor[52554]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 564.373567] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 564.373567] nova-conductor[52554]: ERROR nova.conductor.manager [ 564.373567] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 564.373567] nova-conductor[52554]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 564.373567] nova-conductor[52554]: ERROR nova.conductor.manager [ 564.373567] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 564.373567] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._select_destinations( [ 564.373567] nova-conductor[52554]: ERROR nova.conductor.manager [ 564.373567] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 564.373567] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._schedule( [ 564.373567] nova-conductor[52554]: ERROR nova.conductor.manager [ 564.374497] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 564.374497] nova-conductor[52554]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 564.374497] nova-conductor[52554]: ERROR nova.conductor.manager [ 564.374497] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 564.374497] nova-conductor[52554]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 564.374497] nova-conductor[52554]: ERROR nova.conductor.manager [ 564.374497] nova-conductor[52554]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 564.374497] nova-conductor[52554]: ERROR nova.conductor.manager [ 564.374497] nova-conductor[52554]: ERROR nova.conductor.manager [ 564.380510] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-92d21b6c-f8fa-46b9-b1d9-0d5c34d01aa0 tempest-ListServersNegativeTestJSON-593946779 tempest-ListServersNegativeTestJSON-593946779-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 564.380510] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-92d21b6c-f8fa-46b9-b1d9-0d5c34d01aa0 tempest-ListServersNegativeTestJSON-593946779 tempest-ListServersNegativeTestJSON-593946779-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 564.380687] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-92d21b6c-f8fa-46b9-b1d9-0d5c34d01aa0 tempest-ListServersNegativeTestJSON-593946779 tempest-ListServersNegativeTestJSON-593946779-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 564.429010] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-92d21b6c-f8fa-46b9-b1d9-0d5c34d01aa0 tempest-ListServersNegativeTestJSON-593946779 tempest-ListServersNegativeTestJSON-593946779-project-member] [instance: 0bd301f2-9c59-472c-bc86-e57612f1f991] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 564.429909] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-92d21b6c-f8fa-46b9-b1d9-0d5c34d01aa0 tempest-ListServersNegativeTestJSON-593946779 tempest-ListServersNegativeTestJSON-593946779-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 564.430226] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-92d21b6c-f8fa-46b9-b1d9-0d5c34d01aa0 tempest-ListServersNegativeTestJSON-593946779 tempest-ListServersNegativeTestJSON-593946779-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 564.430368] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-92d21b6c-f8fa-46b9-b1d9-0d5c34d01aa0 tempest-ListServersNegativeTestJSON-593946779 
tempest-ListServersNegativeTestJSON-593946779-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 564.435381] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-92d21b6c-f8fa-46b9-b1d9-0d5c34d01aa0 tempest-ListServersNegativeTestJSON-593946779 tempest-ListServersNegativeTestJSON-593946779-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 564.435381] nova-conductor[52554]: Traceback (most recent call last): [ 564.435381] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 564.435381] nova-conductor[52554]: return func(*args, **kwargs) [ 564.435381] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 564.435381] nova-conductor[52554]: selections = self._select_destinations( [ 564.435381] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 564.435381] nova-conductor[52554]: selections = self._schedule( [ 564.435381] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 564.435381] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 564.435381] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 564.435381] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 564.435381] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 564.435381] nova-conductor[52554]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 564.436214] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-92d21b6c-f8fa-46b9-b1d9-0d5c34d01aa0 tempest-ListServersNegativeTestJSON-593946779 tempest-ListServersNegativeTestJSON-593946779-project-member] [instance: 0bd301f2-9c59-472c-bc86-e57612f1f991] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 564.461025] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-92d21b6c-f8fa-46b9-b1d9-0d5c34d01aa0 tempest-ListServersNegativeTestJSON-593946779 tempest-ListServersNegativeTestJSON-593946779-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 564.461025] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-92d21b6c-f8fa-46b9-b1d9-0d5c34d01aa0 tempest-ListServersNegativeTestJSON-593946779 tempest-ListServersNegativeTestJSON-593946779-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 564.461025] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-92d21b6c-f8fa-46b9-b1d9-0d5c34d01aa0 tempest-ListServersNegativeTestJSON-593946779 tempest-ListServersNegativeTestJSON-593946779-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 564.504969] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-92d21b6c-f8fa-46b9-b1d9-0d5c34d01aa0 tempest-ListServersNegativeTestJSON-593946779 tempest-ListServersNegativeTestJSON-593946779-project-member] [instance: f7a62d62-d66d-450c-a003-b6a89e9894f4] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 564.505558] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-92d21b6c-f8fa-46b9-b1d9-0d5c34d01aa0 tempest-ListServersNegativeTestJSON-593946779 tempest-ListServersNegativeTestJSON-593946779-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 564.505826] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-92d21b6c-f8fa-46b9-b1d9-0d5c34d01aa0 tempest-ListServersNegativeTestJSON-593946779 tempest-ListServersNegativeTestJSON-593946779-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 564.505936] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-92d21b6c-f8fa-46b9-b1d9-0d5c34d01aa0 tempest-ListServersNegativeTestJSON-593946779 tempest-ListServersNegativeTestJSON-593946779-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 564.509992] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-92d21b6c-f8fa-46b9-b1d9-0d5c34d01aa0 tempest-ListServersNegativeTestJSON-593946779 tempest-ListServersNegativeTestJSON-593946779-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 564.509992] nova-conductor[52554]: Traceback (most recent call last): [ 564.509992] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 564.509992] nova-conductor[52554]: return func(*args, **kwargs) [ 564.509992] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 564.509992] nova-conductor[52554]: selections = self._select_destinations( [ 564.509992] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 564.509992] nova-conductor[52554]: selections = self._schedule( [ 564.509992] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 564.509992] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 564.509992] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 564.509992] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 564.509992] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 564.509992] nova-conductor[52554]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 564.509992] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-92d21b6c-f8fa-46b9-b1d9-0d5c34d01aa0 tempest-ListServersNegativeTestJSON-593946779 tempest-ListServersNegativeTestJSON-593946779-project-member] [instance: f7a62d62-d66d-450c-a003-b6a89e9894f4] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 564.535475] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-92d21b6c-f8fa-46b9-b1d9-0d5c34d01aa0 tempest-ListServersNegativeTestJSON-593946779 tempest-ListServersNegativeTestJSON-593946779-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 564.535705] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-92d21b6c-f8fa-46b9-b1d9-0d5c34d01aa0 tempest-ListServersNegativeTestJSON-593946779 tempest-ListServersNegativeTestJSON-593946779-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 564.535885] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-92d21b6c-f8fa-46b9-b1d9-0d5c34d01aa0 tempest-ListServersNegativeTestJSON-593946779 tempest-ListServersNegativeTestJSON-593946779-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 564.588800] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-92d21b6c-f8fa-46b9-b1d9-0d5c34d01aa0 tempest-ListServersNegativeTestJSON-593946779 tempest-ListServersNegativeTestJSON-593946779-project-member] [instance: b8d11415-26fc-4329-8e86-7772a0cd8f4d] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 564.589650] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-92d21b6c-f8fa-46b9-b1d9-0d5c34d01aa0 tempest-ListServersNegativeTestJSON-593946779 tempest-ListServersNegativeTestJSON-593946779-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 564.589818] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-92d21b6c-f8fa-46b9-b1d9-0d5c34d01aa0 tempest-ListServersNegativeTestJSON-593946779 tempest-ListServersNegativeTestJSON-593946779-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 564.590076] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-92d21b6c-f8fa-46b9-b1d9-0d5c34d01aa0 tempest-ListServersNegativeTestJSON-593946779 tempest-ListServersNegativeTestJSON-593946779-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 564.595207] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-92d21b6c-f8fa-46b9-b1d9-0d5c34d01aa0 tempest-ListServersNegativeTestJSON-593946779 tempest-ListServersNegativeTestJSON-593946779-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 564.595207] nova-conductor[52554]: Traceback (most recent call last): [ 564.595207] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 564.595207] nova-conductor[52554]: return func(*args, **kwargs) [ 564.595207] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 564.595207] nova-conductor[52554]: selections = self._select_destinations( [ 564.595207] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 564.595207] nova-conductor[52554]: selections = self._schedule( [ 564.595207] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 564.595207] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 564.595207] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 564.595207] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 564.595207] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 564.595207] nova-conductor[52554]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 564.595747] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-92d21b6c-f8fa-46b9-b1d9-0d5c34d01aa0 tempest-ListServersNegativeTestJSON-593946779 tempest-ListServersNegativeTestJSON-593946779-project-member] [instance: b8d11415-26fc-4329-8e86-7772a0cd8f4d] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 564.904450] nova-conductor[52553]: ERROR nova.conductor.manager [None req-74e0bc35-0f28-453b-a023-1c16d936bc57 tempest-ServersAaction247Test-2075506948 tempest-ServersAaction247Test-2075506948-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 564.904450] nova-conductor[52553]: Traceback (most recent call last): [ 564.904450] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 564.904450] nova-conductor[52553]: return func(*args, **kwargs) [ 564.904450] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 564.904450] nova-conductor[52553]: selections = self._select_destinations( [ 564.904450] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 564.904450] nova-conductor[52553]: selections = self._schedule( [ 564.904450] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 564.904450] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 564.904450] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 564.904450] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 564.904450] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 564.904450] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 564.904450] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 564.904450] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 564.904450] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 564.904450] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 564.904450] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 564.904450] nova-conductor[52553]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 564.904450] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 564.905161] nova-conductor[52553]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 564.905161] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 564.905161] nova-conductor[52553]: ERROR nova.conductor.manager result = self.transport._send( [ 564.905161] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 564.905161] nova-conductor[52553]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 564.905161] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 564.905161] nova-conductor[52553]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 564.905161] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 564.905161] nova-conductor[52553]: ERROR nova.conductor.manager raise result [ 564.905161] nova-conductor[52553]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 564.905161] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 564.905161] nova-conductor[52553]: ERROR nova.conductor.manager [ 564.905161] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 564.905161] nova-conductor[52553]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 564.905161] nova-conductor[52553]: ERROR nova.conductor.manager [ 564.905161] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 564.905161] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._select_destinations( [ 564.905161] nova-conductor[52553]: ERROR nova.conductor.manager [ 564.905161] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 564.905161] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._schedule( [ 564.905161] nova-conductor[52553]: ERROR nova.conductor.manager [ 564.905703] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 564.905703] nova-conductor[52553]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 564.905703] nova-conductor[52553]: ERROR nova.conductor.manager [ 564.905703] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 564.905703] nova-conductor[52553]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 564.905703] nova-conductor[52553]: ERROR nova.conductor.manager [ 564.905703] nova-conductor[52553]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 564.905703] nova-conductor[52553]: ERROR nova.conductor.manager [ 564.905703] nova-conductor[52553]: ERROR nova.conductor.manager [ 564.917776] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-74e0bc35-0f28-453b-a023-1c16d936bc57 tempest-ServersAaction247Test-2075506948 tempest-ServersAaction247Test-2075506948-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 564.918030] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-74e0bc35-0f28-453b-a023-1c16d936bc57 tempest-ServersAaction247Test-2075506948 tempest-ServersAaction247Test-2075506948-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 564.918211] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-74e0bc35-0f28-453b-a023-1c16d936bc57 tempest-ServersAaction247Test-2075506948 tempest-ServersAaction247Test-2075506948-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 565.017405] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-74e0bc35-0f28-453b-a023-1c16d936bc57 tempest-ServersAaction247Test-2075506948 tempest-ServersAaction247Test-2075506948-project-member] [instance: 63ee2c7d-1fb0-4972-bb42-0d050dcb98a8] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 565.018159] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-74e0bc35-0f28-453b-a023-1c16d936bc57 tempest-ServersAaction247Test-2075506948 tempest-ServersAaction247Test-2075506948-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 565.018384] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-74e0bc35-0f28-453b-a023-1c16d936bc57 tempest-ServersAaction247Test-2075506948 tempest-ServersAaction247Test-2075506948-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 565.018558] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-74e0bc35-0f28-453b-a023-1c16d936bc57 tempest-ServersAaction247Test-2075506948 tempest-ServersAaction247Test-2075506948-project-member] Lock "00000000-0000-0000-0000-000000000000" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 565.021956] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-74e0bc35-0f28-453b-a023-1c16d936bc57 tempest-ServersAaction247Test-2075506948 tempest-ServersAaction247Test-2075506948-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 565.021956] nova-conductor[52553]: Traceback (most recent call last): [ 565.021956] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 565.021956] nova-conductor[52553]: return func(*args, **kwargs) [ 565.021956] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 565.021956] nova-conductor[52553]: selections = self._select_destinations( [ 565.021956] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 565.021956] nova-conductor[52553]: selections = self._schedule( [ 565.021956] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 565.021956] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 565.021956] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 565.021956] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 565.021956] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 565.021956] nova-conductor[52553]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 565.022518] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-74e0bc35-0f28-453b-a023-1c16d936bc57 tempest-ServersAaction247Test-2075506948 tempest-ServersAaction247Test-2075506948-project-member] [instance: 63ee2c7d-1fb0-4972-bb42-0d050dcb98a8] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 566.156355] nova-conductor[52554]: ERROR nova.conductor.manager [None req-983cf994-ecfe-455a-9392-30b85cab9cd7 tempest-ServersTestManualDisk-1963154159 tempest-ServersTestManualDisk-1963154159-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 566.156355] nova-conductor[52554]: Traceback (most recent call last): [ 566.156355] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 566.156355] nova-conductor[52554]: return func(*args, **kwargs) [ 566.156355] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 566.156355] nova-conductor[52554]: selections = self._select_destinations( [ 566.156355] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 566.156355] nova-conductor[52554]: selections = self._schedule( [ 566.156355] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 566.156355] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 566.156355] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 566.156355] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 566.156355] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 566.156355] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 566.156355] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 566.156355] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 566.156355] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 566.156355] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 566.156355] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 566.156355] nova-conductor[52554]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 566.156355] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 566.157165] nova-conductor[52554]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 566.157165] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 566.157165] nova-conductor[52554]: ERROR nova.conductor.manager result = self.transport._send( [ 566.157165] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 566.157165] nova-conductor[52554]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 566.157165] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 566.157165] nova-conductor[52554]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 566.157165] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 566.157165] nova-conductor[52554]: ERROR nova.conductor.manager raise result [ 566.157165] nova-conductor[52554]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 566.157165] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 566.157165] nova-conductor[52554]: ERROR nova.conductor.manager [ 566.157165] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 566.157165] nova-conductor[52554]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 566.157165] nova-conductor[52554]: ERROR nova.conductor.manager [ 566.157165] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 566.157165] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._select_destinations( [ 566.157165] nova-conductor[52554]: ERROR nova.conductor.manager [ 566.157165] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 566.157165] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._schedule( [ 566.157165] nova-conductor[52554]: ERROR nova.conductor.manager [ 566.157741] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 566.157741] nova-conductor[52554]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 566.157741] nova-conductor[52554]: ERROR nova.conductor.manager [ 566.157741] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 566.157741] nova-conductor[52554]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 566.157741] nova-conductor[52554]: ERROR nova.conductor.manager [ 566.157741] nova-conductor[52554]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 566.157741] nova-conductor[52554]: ERROR nova.conductor.manager [ 566.157741] nova-conductor[52554]: ERROR nova.conductor.manager [ 566.167483] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-983cf994-ecfe-455a-9392-30b85cab9cd7 tempest-ServersTestManualDisk-1963154159 tempest-ServersTestManualDisk-1963154159-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 566.167705] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-983cf994-ecfe-455a-9392-30b85cab9cd7 tempest-ServersTestManualDisk-1963154159 tempest-ServersTestManualDisk-1963154159-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 566.168488] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-983cf994-ecfe-455a-9392-30b85cab9cd7 tempest-ServersTestManualDisk-1963154159 tempest-ServersTestManualDisk-1963154159-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 566.258035] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-983cf994-ecfe-455a-9392-30b85cab9cd7 tempest-ServersTestManualDisk-1963154159 tempest-ServersTestManualDisk-1963154159-project-member] [instance: 4d275bfc-79b1-43fe-bdbf-553201764aae] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 566.258722] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-983cf994-ecfe-455a-9392-30b85cab9cd7 tempest-ServersTestManualDisk-1963154159 tempest-ServersTestManualDisk-1963154159-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 566.258943] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-983cf994-ecfe-455a-9392-30b85cab9cd7 tempest-ServersTestManualDisk-1963154159 tempest-ServersTestManualDisk-1963154159-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 566.259139] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-983cf994-ecfe-455a-9392-30b85cab9cd7 tempest-ServersTestManualDisk-1963154159 tempest-ServersTestManualDisk-1963154159-project-member] Lock "00000000-0000-0000-0000-000000000000" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 566.264287] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-983cf994-ecfe-455a-9392-30b85cab9cd7 tempest-ServersTestManualDisk-1963154159 tempest-ServersTestManualDisk-1963154159-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 566.264287] nova-conductor[52554]: Traceback (most recent call last): [ 566.264287] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 566.264287] nova-conductor[52554]: return func(*args, **kwargs) [ 566.264287] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 566.264287] nova-conductor[52554]: selections = self._select_destinations( [ 566.264287] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 566.264287] nova-conductor[52554]: selections = self._schedule( [ 566.264287] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 566.264287] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 566.264287] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 566.264287] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 566.264287] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 566.264287] nova-conductor[52554]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 566.267103] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-983cf994-ecfe-455a-9392-30b85cab9cd7 tempest-ServersTestManualDisk-1963154159 tempest-ServersTestManualDisk-1963154159-project-member] [instance: 4d275bfc-79b1-43fe-bdbf-553201764aae] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 568.553122] nova-conductor[52553]: ERROR nova.conductor.manager [None req-245886bd-5cee-4f01-b3e9-2635f023302c tempest-ServersAdminTestJSON-1234628398 tempest-ServersAdminTestJSON-1234628398-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 568.553122] nova-conductor[52553]: Traceback (most recent call last): [ 568.553122] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 568.553122] nova-conductor[52553]: return func(*args, **kwargs) [ 568.553122] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 568.553122] nova-conductor[52553]: selections = self._select_destinations( [ 568.553122] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 568.553122] nova-conductor[52553]: selections = self._schedule( [ 568.553122] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 568.553122] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 568.553122] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 568.553122] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 568.553122] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 568.553122] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 568.553122] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 568.553122] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 568.553122] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 568.553122] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 568.553122] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 568.553122] nova-conductor[52553]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 568.553122] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 568.553944] nova-conductor[52553]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 568.553944] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 568.553944] nova-conductor[52553]: ERROR nova.conductor.manager result = self.transport._send( [ 568.553944] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 568.553944] nova-conductor[52553]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 568.553944] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 568.553944] nova-conductor[52553]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 568.553944] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 568.553944] nova-conductor[52553]: ERROR nova.conductor.manager raise result [ 568.553944] nova-conductor[52553]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 568.553944] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 568.553944] nova-conductor[52553]: ERROR nova.conductor.manager [ 568.553944] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 568.553944] nova-conductor[52553]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 568.553944] nova-conductor[52553]: ERROR nova.conductor.manager [ 568.553944] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 568.553944] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._select_destinations( [ 568.553944] nova-conductor[52553]: ERROR nova.conductor.manager [ 568.553944] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 568.553944] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._schedule( [ 568.553944] nova-conductor[52553]: ERROR nova.conductor.manager [ 568.554508] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 568.554508] nova-conductor[52553]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 568.554508] nova-conductor[52553]: ERROR nova.conductor.manager [ 568.554508] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 568.554508] nova-conductor[52553]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 568.554508] nova-conductor[52553]: ERROR nova.conductor.manager [ 568.554508] nova-conductor[52553]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 568.554508] nova-conductor[52553]: ERROR nova.conductor.manager [ 568.554508] nova-conductor[52553]: ERROR nova.conductor.manager [ 568.562379] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-245886bd-5cee-4f01-b3e9-2635f023302c tempest-ServersAdminTestJSON-1234628398 tempest-ServersAdminTestJSON-1234628398-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 568.562757] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-245886bd-5cee-4f01-b3e9-2635f023302c tempest-ServersAdminTestJSON-1234628398 tempest-ServersAdminTestJSON-1234628398-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 568.562850] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-245886bd-5cee-4f01-b3e9-2635f023302c tempest-ServersAdminTestJSON-1234628398 tempest-ServersAdminTestJSON-1234628398-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 568.613985] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-245886bd-5cee-4f01-b3e9-2635f023302c tempest-ServersAdminTestJSON-1234628398 tempest-ServersAdminTestJSON-1234628398-project-member] [instance: 1963b819-7560-46b8-b98e-617bb99f7bc2] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 568.614736] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-245886bd-5cee-4f01-b3e9-2635f023302c tempest-ServersAdminTestJSON-1234628398 tempest-ServersAdminTestJSON-1234628398-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 568.614942] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-245886bd-5cee-4f01-b3e9-2635f023302c tempest-ServersAdminTestJSON-1234628398 tempest-ServersAdminTestJSON-1234628398-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 568.615268] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-245886bd-5cee-4f01-b3e9-2635f023302c tempest-ServersAdminTestJSON-1234628398 tempest-ServersAdminTestJSON-1234628398-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 568.619373] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-245886bd-5cee-4f01-b3e9-2635f023302c tempest-ServersAdminTestJSON-1234628398 tempest-ServersAdminTestJSON-1234628398-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 568.619373] nova-conductor[52553]: Traceback (most recent call last): [ 568.619373] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 568.619373] nova-conductor[52553]: return func(*args, **kwargs) [ 568.619373] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 568.619373] nova-conductor[52553]: selections = self._select_destinations( [ 568.619373] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 568.619373] nova-conductor[52553]: selections = self._schedule( [ 568.619373] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 568.619373] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 568.619373] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 568.619373] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 568.619373] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 568.619373] nova-conductor[52553]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 568.619779] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-245886bd-5cee-4f01-b3e9-2635f023302c tempest-ServersAdminTestJSON-1234628398 tempest-ServersAdminTestJSON-1234628398-project-member] [instance: 1963b819-7560-46b8-b98e-617bb99f7bc2] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 570.825940] nova-conductor[52554]: ERROR nova.conductor.manager [None req-1d5a8c6b-ce43-4083-8e69-042171382e02 tempest-ServerPasswordTestJSON-909579219 tempest-ServerPasswordTestJSON-909579219-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 570.825940] nova-conductor[52554]: Traceback (most recent call last): [ 570.825940] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 570.825940] nova-conductor[52554]: return func(*args, **kwargs) [ 570.825940] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 570.825940] nova-conductor[52554]: selections = self._select_destinations( [ 570.825940] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 570.825940] nova-conductor[52554]: selections = self._schedule( [ 570.825940] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 570.825940] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 570.825940] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 570.825940] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 570.825940] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 570.825940] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 570.825940] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 570.825940] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 570.825940] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 570.825940] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 570.825940] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 570.825940] nova-conductor[52554]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 570.825940] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 570.826765] nova-conductor[52554]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 570.826765] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 570.826765] nova-conductor[52554]: ERROR nova.conductor.manager result = self.transport._send( [ 570.826765] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 570.826765] nova-conductor[52554]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 570.826765] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 570.826765] nova-conductor[52554]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 570.826765] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 570.826765] nova-conductor[52554]: ERROR nova.conductor.manager raise result [ 570.826765] nova-conductor[52554]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 570.826765] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 570.826765] nova-conductor[52554]: ERROR nova.conductor.manager [ 570.826765] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 570.826765] nova-conductor[52554]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 570.826765] nova-conductor[52554]: ERROR nova.conductor.manager [ 570.826765] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 570.826765] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._select_destinations( [ 570.826765] nova-conductor[52554]: ERROR nova.conductor.manager [ 570.826765] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 570.826765] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._schedule( [ 570.826765] nova-conductor[52554]: ERROR nova.conductor.manager [ 570.827320] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 570.827320] nova-conductor[52554]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 570.827320] nova-conductor[52554]: ERROR nova.conductor.manager [ 570.827320] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 570.827320] nova-conductor[52554]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 570.827320] nova-conductor[52554]: ERROR nova.conductor.manager [ 570.827320] nova-conductor[52554]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 570.827320] nova-conductor[52554]: ERROR nova.conductor.manager [ 570.827320] nova-conductor[52554]: ERROR nova.conductor.manager [ 570.833621] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-1d5a8c6b-ce43-4083-8e69-042171382e02 tempest-ServerPasswordTestJSON-909579219 tempest-ServerPasswordTestJSON-909579219-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 570.834073] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-1d5a8c6b-ce43-4083-8e69-042171382e02 tempest-ServerPasswordTestJSON-909579219 tempest-ServerPasswordTestJSON-909579219-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 570.834156] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-1d5a8c6b-ce43-4083-8e69-042171382e02 tempest-ServerPasswordTestJSON-909579219 tempest-ServerPasswordTestJSON-909579219-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 570.879406] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-1d5a8c6b-ce43-4083-8e69-042171382e02 tempest-ServerPasswordTestJSON-909579219 tempest-ServerPasswordTestJSON-909579219-project-member] [instance: 81a9bbe4-aa7f-42b3-b14c-7c82123c46e2] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 570.879406] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-1d5a8c6b-ce43-4083-8e69-042171382e02 tempest-ServerPasswordTestJSON-909579219 tempest-ServerPasswordTestJSON-909579219-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 570.879406] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-1d5a8c6b-ce43-4083-8e69-042171382e02 tempest-ServerPasswordTestJSON-909579219 tempest-ServerPasswordTestJSON-909579219-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 570.879659] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-1d5a8c6b-ce43-4083-8e69-042171382e02 tempest-ServerPasswordTestJSON-909579219 tempest-ServerPasswordTestJSON-909579219-project-member] Lock "00000000-0000-0000-0000-000000000000" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 570.886566] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-1d5a8c6b-ce43-4083-8e69-042171382e02 tempest-ServerPasswordTestJSON-909579219 tempest-ServerPasswordTestJSON-909579219-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 570.886566] nova-conductor[52554]: Traceback (most recent call last): [ 570.886566] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 570.886566] nova-conductor[52554]: return func(*args, **kwargs) [ 570.886566] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 570.886566] nova-conductor[52554]: selections = self._select_destinations( [ 570.886566] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 570.886566] nova-conductor[52554]: selections = self._schedule( [ 570.886566] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 570.886566] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 570.886566] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 570.886566] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 570.886566] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 570.886566] nova-conductor[52554]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 570.887114] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-1d5a8c6b-ce43-4083-8e69-042171382e02 tempest-ServerPasswordTestJSON-909579219 tempest-ServerPasswordTestJSON-909579219-project-member] [instance: 81a9bbe4-aa7f-42b3-b14c-7c82123c46e2] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 572.224432] nova-conductor[52554]: ERROR nova.scheduler.utils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance a93c0169-490e-4cd2-b890-5e1d8aecae59 was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 572.224955] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Rescheduling: True {{(pid=52554) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 572.225261] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance a93c0169-490e-4cd2-b890-5e1d8aecae59.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance a93c0169-490e-4cd2-b890-5e1d8aecae59. 
[ 572.225565] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-9af18614-334f-48c8-a4ed-e0a7157bfe48 tempest-ServerShowV257Test-744124833 tempest-ServerShowV257Test-744124833-project-member] [instance: a93c0169-490e-4cd2-b890-5e1d8aecae59] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance a93c0169-490e-4cd2-b890-5e1d8aecae59. [ 572.523642] nova-conductor[52554]: ERROR nova.conductor.manager [None req-689faa1d-0f66-4c9e-a01a-27223e8b29b8 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 572.523642] nova-conductor[52554]: Traceback (most recent call last): [ 572.523642] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 572.523642] nova-conductor[52554]: return func(*args, **kwargs) [ 572.523642] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 572.523642] nova-conductor[52554]: selections = self._select_destinations( [ 572.523642] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 572.523642] nova-conductor[52554]: selections = self._schedule( [ 572.523642] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 572.523642] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 572.523642] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 572.523642] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 572.523642] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 572.523642] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 572.523642] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 572.523642] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 572.523642] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 572.523642] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 572.523642] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 572.523642] nova-conductor[52554]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 572.523642] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 572.523642] nova-conductor[52554]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 572.523642] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 572.523642] nova-conductor[52554]: ERROR nova.conductor.manager result = self.transport._send( [ 572.523642] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 572.523642] nova-conductor[52554]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 572.523642] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 572.523642] nova-conductor[52554]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 572.523642] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 572.523642] nova-conductor[52554]: ERROR nova.conductor.manager raise result [ 572.523642] nova-conductor[52554]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 572.523642] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 572.523642] nova-conductor[52554]: ERROR nova.conductor.manager [ 572.523642] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 572.523642] nova-conductor[52554]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 572.523642] nova-conductor[52554]: ERROR nova.conductor.manager [ 572.523642] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 572.523642] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._select_destinations( [ 572.523642] nova-conductor[52554]: ERROR nova.conductor.manager [ 572.523642] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 572.523642] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._schedule( [ 572.523642] nova-conductor[52554]: ERROR nova.conductor.manager [ 572.523642] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 572.523642] nova-conductor[52554]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 572.523642] nova-conductor[52554]: ERROR nova.conductor.manager [ 572.523642] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 572.523642] nova-conductor[52554]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 572.523642] nova-conductor[52554]: ERROR nova.conductor.manager [ 572.523642] nova-conductor[52554]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 572.523642] nova-conductor[52554]: ERROR nova.conductor.manager [ 572.523642] nova-conductor[52554]: ERROR nova.conductor.manager [ 572.543876] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-689faa1d-0f66-4c9e-a01a-27223e8b29b8 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 572.543876] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-689faa1d-0f66-4c9e-a01a-27223e8b29b8 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 572.543876] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-689faa1d-0f66-4c9e-a01a-27223e8b29b8 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 572.612258] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-689faa1d-0f66-4c9e-a01a-27223e8b29b8 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 27929949-feed-4c23-b36a-7ffe9df6c5e5] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 572.612996] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-689faa1d-0f66-4c9e-a01a-27223e8b29b8 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 572.613227] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-689faa1d-0f66-4c9e-a01a-27223e8b29b8 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 572.613395] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-689faa1d-0f66-4c9e-a01a-27223e8b29b8 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 572.616770] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-689faa1d-0f66-4c9e-a01a-27223e8b29b8 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 572.616770] nova-conductor[52554]: Traceback (most recent call last): [ 572.616770] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 572.616770] nova-conductor[52554]: return func(*args, **kwargs) [ 572.616770] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 572.616770] nova-conductor[52554]: selections = self._select_destinations( [ 572.616770] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 572.616770] nova-conductor[52554]: selections = self._schedule( [ 572.616770] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 572.616770] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 572.616770] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 572.616770] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 572.616770] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 572.616770] nova-conductor[52554]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 572.617333] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-689faa1d-0f66-4c9e-a01a-27223e8b29b8 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 27929949-feed-4c23-b36a-7ffe9df6c5e5] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 572.940362] nova-conductor[52553]: ERROR nova.conductor.manager [None req-b2acd679-2f79-4018-9bde-9d1a7c570973 tempest-ServersAdminTestJSON-1234628398 tempest-ServersAdminTestJSON-1234628398-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 572.940362] nova-conductor[52553]: Traceback (most recent call last): [ 572.940362] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 572.940362] nova-conductor[52553]: return func(*args, **kwargs) [ 572.940362] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 572.940362] nova-conductor[52553]: selections = self._select_destinations( [ 572.940362] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 572.940362] nova-conductor[52553]: selections = self._schedule( [ 572.940362] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 572.940362] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 572.940362] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 572.940362] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 572.940362] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 572.940362] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 572.940362] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 572.940362] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 572.940362] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 572.940362] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 572.940362] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 572.940362] nova-conductor[52553]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 572.940362] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 572.940362] nova-conductor[52553]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 572.940362] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 572.940362] nova-conductor[52553]: ERROR nova.conductor.manager result = self.transport._send( [ 572.940362] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 572.940362] nova-conductor[52553]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 572.940362] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 572.940362] nova-conductor[52553]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 572.940362] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 572.940362] nova-conductor[52553]: ERROR nova.conductor.manager raise result [ 572.940362] nova-conductor[52553]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 572.940362] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 572.940362] nova-conductor[52553]: ERROR nova.conductor.manager [ 572.940362] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 572.940362] nova-conductor[52553]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 572.940362] nova-conductor[52553]: ERROR nova.conductor.manager [ 572.940362] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 572.940362] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._select_destinations( [ 572.940362] nova-conductor[52553]: ERROR nova.conductor.manager [ 572.940362] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 572.940362] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._schedule( [ 572.940362] nova-conductor[52553]: ERROR nova.conductor.manager [ 572.940362] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 572.940362] nova-conductor[52553]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 572.940362] nova-conductor[52553]: ERROR nova.conductor.manager [ 572.940362] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 572.940362] nova-conductor[52553]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 572.940362] nova-conductor[52553]: ERROR nova.conductor.manager [ 572.940362] nova-conductor[52553]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
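On the conductor side the same failure surfaces as nova.exception_Remote.NoValidHost_Remote: cctxt.call() waits for the scheduler's reply, and when that reply carries a failure, oslo.messaging rebuilds the exception on the caller under a class name with a "_Remote" suffix so the log makes clear it was raised on the far end. The sketch below illustrates that idea only; it is not oslo.messaging's actual deserialization code.

```python
# Illustration of the "_Remote" wrapping visible in the log above: the RPC
# client rebuilds the scheduler's exception locally as a dynamically created
# subclass whose name carries a "_Remote" suffix, so callers can still catch
# the original type. This is a sketch of the idea, not oslo.messaging code.

class NoValidHost(Exception):
    """Stand-in for nova.exception.NoValidHost."""


def rebuild_remote_exception(exc_type, message, server_traceback):
    remote_type = type(exc_type.__name__ + "_Remote", (exc_type,), {})
    # Keep the server-side traceback text with the message, which is why the
    # conductor log prints the scheduler's frames a second time.
    return remote_type(message + "\n" + server_traceback)


err = rebuild_remote_exception(
    NoValidHost,
    "No valid host was found. There are not enough hosts available.",
    "Traceback (most recent call last): ...",  # scheduler-side frames, elided
)
print(type(err).__name__)            # NoValidHost_Remote
print(isinstance(err, NoValidHost))  # True: "except NoValidHost" still matches
```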
[ 572.940362] nova-conductor[52553]: ERROR nova.conductor.manager [ 572.940362] nova-conductor[52553]: ERROR nova.conductor.manager [ 572.950998] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-b2acd679-2f79-4018-9bde-9d1a7c570973 tempest-ServersAdminTestJSON-1234628398 tempest-ServersAdminTestJSON-1234628398-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 572.951273] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-b2acd679-2f79-4018-9bde-9d1a7c570973 tempest-ServersAdminTestJSON-1234628398 tempest-ServersAdminTestJSON-1234628398-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 572.951451] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-b2acd679-2f79-4018-9bde-9d1a7c570973 tempest-ServersAdminTestJSON-1234628398 tempest-ServersAdminTestJSON-1234628398-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 573.004255] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-b2acd679-2f79-4018-9bde-9d1a7c570973 tempest-ServersAdminTestJSON-1234628398 tempest-ServersAdminTestJSON-1234628398-project-member] [instance: c0e77e55-50ed-4008-b075-c0424c3457a1] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 573.005010] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-b2acd679-2f79-4018-9bde-9d1a7c570973 tempest-ServersAdminTestJSON-1234628398 tempest-ServersAdminTestJSON-1234628398-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 573.008861] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-b2acd679-2f79-4018-9bde-9d1a7c570973 tempest-ServersAdminTestJSON-1234628398 tempest-ServersAdminTestJSON-1234628398-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 573.008861] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-b2acd679-2f79-4018-9bde-9d1a7c570973 tempest-ServersAdminTestJSON-1234628398 tempest-ServersAdminTestJSON-1234628398-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 573.009786] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-b2acd679-2f79-4018-9bde-9d1a7c570973 tempest-ServersAdminTestJSON-1234628398 tempest-ServersAdminTestJSON-1234628398-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 573.009786] nova-conductor[52553]: Traceback (most recent call last): [ 573.009786] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 573.009786] nova-conductor[52553]: return func(*args, **kwargs) [ 573.009786] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 573.009786] nova-conductor[52553]: selections = self._select_destinations( [ 573.009786] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 573.009786] nova-conductor[52553]: selections = self._schedule( [ 573.009786] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 573.009786] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 573.009786] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 573.009786] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 573.009786] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 573.009786] nova-conductor[52553]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 573.010271] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-b2acd679-2f79-4018-9bde-9d1a7c570973 tempest-ServersAdminTestJSON-1234628398 tempest-ServersAdminTestJSON-1234628398-project-member] [instance: c0e77e55-50ed-4008-b075-c0424c3457a1] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 577.028198] nova-conductor[52553]: ERROR nova.conductor.manager [None req-c7a07268-e271-453e-acfa-c49744d7e3ce tempest-ImagesTestJSON-1412812746 tempest-ImagesTestJSON-1412812746-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 577.028198] nova-conductor[52553]: Traceback (most recent call last): [ 577.028198] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 577.028198] nova-conductor[52553]: return func(*args, **kwargs) [ 577.028198] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 577.028198] nova-conductor[52553]: selections = self._select_destinations( [ 577.028198] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 577.028198] nova-conductor[52553]: selections = self._schedule( [ 577.028198] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 577.028198] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 577.028198] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 577.028198] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 577.028198] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 577.028198] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 577.028198] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 577.028198] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 577.028198] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 577.028198] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 577.028198] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 577.028198] nova-conductor[52553]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 577.028198] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 577.028198] nova-conductor[52553]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 577.028198] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 577.028198] nova-conductor[52553]: ERROR nova.conductor.manager result = self.transport._send( [ 577.028198] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 577.028198] nova-conductor[52553]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 577.028198] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 577.028198] nova-conductor[52553]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 577.028198] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 577.028198] nova-conductor[52553]: ERROR nova.conductor.manager raise result [ 577.028198] nova-conductor[52553]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 577.028198] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 577.028198] nova-conductor[52553]: ERROR nova.conductor.manager [ 577.028198] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 577.028198] nova-conductor[52553]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 577.028198] nova-conductor[52553]: ERROR nova.conductor.manager [ 577.028198] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 577.028198] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._select_destinations( [ 577.028198] nova-conductor[52553]: ERROR nova.conductor.manager [ 577.028198] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 577.028198] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._schedule( [ 577.028198] nova-conductor[52553]: ERROR nova.conductor.manager [ 577.028198] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 577.028198] nova-conductor[52553]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 577.028198] nova-conductor[52553]: ERROR nova.conductor.manager [ 577.028198] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 577.028198] nova-conductor[52553]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 577.028198] nova-conductor[52553]: ERROR nova.conductor.manager [ 577.028198] nova-conductor[52553]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
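Between failures, the conductor also dumps the BlockDeviceMapping it built for each doomed instance (boot_index=0, source_type='image', destination_type='local', delete_on_termination=True): an image-backed root disk on local hypervisor storage. The dict below is an equivalent block_device_mapping_v2 entry written out for illustration; the tempest tests most likely passed only imageRef and let nova synthesize this mapping.

```python
# Illustrative only: a block_device_mapping_v2 entry equivalent to the
# BlockDeviceMapping objects logged above. This is an assumption about what
# the mapping means, not a copy of the actual tempest request.
root_disk_bdm = {
    "boot_index": 0,                 # the disk the guest boots from
    "source_type": "image",          # backed by the Glance image below
    "uuid": "856e89ba-b7a4-4a81-ad9d-2997fe327c0c",
    "destination_type": "local",     # built on local hypervisor storage
    "delete_on_termination": True,   # discarded with the server
}
print(root_disk_bdm)
```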
[ 577.028198] nova-conductor[52553]: ERROR nova.conductor.manager [ 577.028198] nova-conductor[52553]: ERROR nova.conductor.manager [ 577.038127] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-c7a07268-e271-453e-acfa-c49744d7e3ce tempest-ImagesTestJSON-1412812746 tempest-ImagesTestJSON-1412812746-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 577.038127] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-c7a07268-e271-453e-acfa-c49744d7e3ce tempest-ImagesTestJSON-1412812746 tempest-ImagesTestJSON-1412812746-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 577.038127] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-c7a07268-e271-453e-acfa-c49744d7e3ce tempest-ImagesTestJSON-1412812746 tempest-ImagesTestJSON-1412812746-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 577.087028] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-c7a07268-e271-453e-acfa-c49744d7e3ce tempest-ImagesTestJSON-1412812746 tempest-ImagesTestJSON-1412812746-project-member] [instance: d917d1b2-1971-4adc-890c-9038972bc119] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 577.087407] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-c7a07268-e271-453e-acfa-c49744d7e3ce tempest-ImagesTestJSON-1412812746 tempest-ImagesTestJSON-1412812746-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 577.087622] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-c7a07268-e271-453e-acfa-c49744d7e3ce tempest-ImagesTestJSON-1412812746 tempest-ImagesTestJSON-1412812746-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 577.087791] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-c7a07268-e271-453e-acfa-c49744d7e3ce tempest-ImagesTestJSON-1412812746 tempest-ImagesTestJSON-1412812746-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s 
{{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 577.091022] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-c7a07268-e271-453e-acfa-c49744d7e3ce tempest-ImagesTestJSON-1412812746 tempest-ImagesTestJSON-1412812746-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 577.091022] nova-conductor[52553]: Traceback (most recent call last): [ 577.091022] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 577.091022] nova-conductor[52553]: return func(*args, **kwargs) [ 577.091022] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 577.091022] nova-conductor[52553]: selections = self._select_destinations( [ 577.091022] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 577.091022] nova-conductor[52553]: selections = self._schedule( [ 577.091022] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 577.091022] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 577.091022] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 577.091022] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 577.091022] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 577.091022] nova-conductor[52553]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 577.091544] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-c7a07268-e271-453e-acfa-c49744d7e3ce tempest-ImagesTestJSON-1412812746 tempest-ImagesTestJSON-1412812746-project-member] [instance: d917d1b2-1971-4adc-890c-9038972bc119] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 578.498512] nova-conductor[52554]: ERROR nova.conductor.manager [None req-1431f013-cf4f-4b3f-a0f8-ce39b5d567d2 tempest-ServersTestFqdnHostnames-795084039 tempest-ServersTestFqdnHostnames-795084039-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 578.498512] nova-conductor[52554]: Traceback (most recent call last): [ 578.498512] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 578.498512] nova-conductor[52554]: return func(*args, **kwargs) [ 578.498512] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 578.498512] nova-conductor[52554]: selections = self._select_destinations( [ 578.498512] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 578.498512] nova-conductor[52554]: selections = self._schedule( [ 578.498512] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 578.498512] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 578.498512] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 578.498512] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 578.498512] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 578.498512] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 578.498512] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 578.498512] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 578.498512] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 578.498512] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 578.498512] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 578.498512] nova-conductor[52554]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 578.498512] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 578.498512] nova-conductor[52554]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 578.498512] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 578.498512] nova-conductor[52554]: ERROR nova.conductor.manager result = self.transport._send( [ 578.498512] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 578.498512] nova-conductor[52554]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 578.498512] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 578.498512] nova-conductor[52554]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 578.498512] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 578.498512] nova-conductor[52554]: ERROR nova.conductor.manager raise result [ 578.498512] nova-conductor[52554]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 578.498512] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 578.498512] nova-conductor[52554]: ERROR nova.conductor.manager [ 578.498512] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 578.498512] nova-conductor[52554]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 578.498512] nova-conductor[52554]: ERROR nova.conductor.manager [ 578.498512] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 578.498512] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._select_destinations( [ 578.498512] nova-conductor[52554]: ERROR nova.conductor.manager [ 578.498512] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 578.498512] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._schedule( [ 578.498512] nova-conductor[52554]: ERROR nova.conductor.manager [ 578.498512] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 578.498512] nova-conductor[52554]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 578.498512] nova-conductor[52554]: ERROR nova.conductor.manager [ 578.498512] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 578.498512] nova-conductor[52554]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 578.498512] nova-conductor[52554]: ERROR nova.conductor.manager [ 578.498512] nova-conductor[52554]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
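The surrounding DEBUG lines from oslo_concurrency.lockutils show each request briefly taking a lock named after the cell UUID while set_target_cell caches the per-cell database connection. A rough sketch of that pattern follows; the synchronized() decorator is real oslo.concurrency API, but the cache and function are simplified stand-ins rather than nova.context code.

```python
# Rough sketch of the locking pattern behind the lockutils DEBUG lines above:
# a lock named after the cell UUID guards a per-cell connection cache.
from oslo_concurrency import lockutils

CELL_CACHE = {}


def get_or_set_cached_cell(cell_uuid, make_connection):
    # The decorator's wrapper is what emits the "Acquiring lock ... acquired
    # ... released" trio seen in the log each time a request targets a cell.
    @lockutils.synchronized(cell_uuid)
    def _locked():
        if cell_uuid not in CELL_CACHE:
            CELL_CACHE[cell_uuid] = make_connection()
        return CELL_CACHE[cell_uuid]
    return _locked()


# The first caller per cell pays the connection cost; later callers reuse it,
# consistent with the near-zero hold times logged above.
conn = get_or_set_cached_cell(
    "00000000-0000-0000-0000-000000000000", lambda: object())
```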
[ 578.498512] nova-conductor[52554]: ERROR nova.conductor.manager [ 578.498512] nova-conductor[52554]: ERROR nova.conductor.manager [ 578.511463] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-1431f013-cf4f-4b3f-a0f8-ce39b5d567d2 tempest-ServersTestFqdnHostnames-795084039 tempest-ServersTestFqdnHostnames-795084039-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 578.511463] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-1431f013-cf4f-4b3f-a0f8-ce39b5d567d2 tempest-ServersTestFqdnHostnames-795084039 tempest-ServersTestFqdnHostnames-795084039-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 578.511667] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-1431f013-cf4f-4b3f-a0f8-ce39b5d567d2 tempest-ServersTestFqdnHostnames-795084039 tempest-ServersTestFqdnHostnames-795084039-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 578.554908] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-1431f013-cf4f-4b3f-a0f8-ce39b5d567d2 tempest-ServersTestFqdnHostnames-795084039 tempest-ServersTestFqdnHostnames-795084039-project-member] [instance: 9c8a34dd-510f-45c6-b748-f441f21e0b70] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 578.555628] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-1431f013-cf4f-4b3f-a0f8-ce39b5d567d2 tempest-ServersTestFqdnHostnames-795084039 tempest-ServersTestFqdnHostnames-795084039-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 578.556413] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-1431f013-cf4f-4b3f-a0f8-ce39b5d567d2 tempest-ServersTestFqdnHostnames-795084039 tempest-ServersTestFqdnHostnames-795084039-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 578.556413] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-1431f013-cf4f-4b3f-a0f8-ce39b5d567d2 tempest-ServersTestFqdnHostnames-795084039 tempest-ServersTestFqdnHostnames-795084039-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 578.561415] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-1431f013-cf4f-4b3f-a0f8-ce39b5d567d2 tempest-ServersTestFqdnHostnames-795084039 tempest-ServersTestFqdnHostnames-795084039-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 578.561415] nova-conductor[52554]: Traceback (most recent call last): [ 578.561415] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 578.561415] nova-conductor[52554]: return func(*args, **kwargs) [ 578.561415] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 578.561415] nova-conductor[52554]: selections = self._select_destinations( [ 578.561415] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 578.561415] nova-conductor[52554]: selections = self._schedule( [ 578.561415] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 578.561415] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 578.561415] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 578.561415] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 578.561415] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 578.561415] nova-conductor[52554]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 578.565336] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-1431f013-cf4f-4b3f-a0f8-ce39b5d567d2 tempest-ServersTestFqdnHostnames-795084039 tempest-ServersTestFqdnHostnames-795084039-project-member] [instance: 9c8a34dd-510f-45c6-b748-f441f21e0b70] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 581.763178] nova-conductor[52553]: ERROR nova.conductor.manager [None req-fada62a4-d4b4-48ab-98a9-12a4b2741cb3 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 581.763178] nova-conductor[52553]: Traceback (most recent call last): [ 581.763178] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 581.763178] nova-conductor[52553]: return func(*args, **kwargs) [ 581.763178] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 581.763178] nova-conductor[52553]: selections = self._select_destinations( [ 581.763178] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 581.763178] nova-conductor[52553]: selections = self._schedule( [ 581.763178] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 581.763178] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 581.763178] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 581.763178] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 581.763178] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 581.763178] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 581.763178] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 581.763178] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 581.763178] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 581.763178] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 581.763178] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 581.763178] nova-conductor[52553]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 581.763178] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 581.763178] nova-conductor[52553]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 581.763178] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 581.763178] nova-conductor[52553]: ERROR nova.conductor.manager result = self.transport._send( [ 581.763178] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 581.763178] nova-conductor[52553]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 581.763178] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 581.763178] nova-conductor[52553]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 581.763178] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 581.763178] nova-conductor[52553]: ERROR nova.conductor.manager raise result [ 581.763178] nova-conductor[52553]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 581.763178] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 581.763178] nova-conductor[52553]: ERROR nova.conductor.manager [ 581.763178] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 581.763178] nova-conductor[52553]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 581.763178] nova-conductor[52553]: ERROR nova.conductor.manager [ 581.763178] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 581.763178] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._select_destinations( [ 581.763178] nova-conductor[52553]: ERROR nova.conductor.manager [ 581.763178] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 581.763178] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._schedule( [ 581.763178] nova-conductor[52553]: ERROR nova.conductor.manager [ 581.763178] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 581.763178] nova-conductor[52553]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 581.763178] nova-conductor[52553]: ERROR nova.conductor.manager [ 581.763178] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 581.763178] nova-conductor[52553]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 581.763178] nova-conductor[52553]: ERROR nova.conductor.manager [ 581.763178] nova-conductor[52553]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
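Each failure ends the same way: nova.scheduler.utils logs "Failed to compute_task_build_instances" and then "Setting instance to ERROR state.", i.e. the conductor absorbs the scheduling error and marks the instance instead of retrying. A hedged sketch of that catch-and-mark flow, with stand-in names rather than nova's own helpers, is below.

```python
# Hedged sketch of the catch-and-mark flow the WARNING pairs above describe:
# when scheduling raises NoValidHost, the conductor records the fault and
# flips the instance to ERROR rather than crashing the request. All names
# here are stand-ins; this is not nova.conductor or nova.scheduler.utils code.

class NoValidHost(Exception):
    """Stand-in for nova.exception.NoValidHost."""


def schedule(request_spec):
    # In the real system this is the RPC round-trip to nova-scheduler; here
    # it fails the way every request in this capture does.
    raise NoValidHost("No valid host was found. "
                      "There are not enough hosts available.")


def build_instance(instance, request_spec):
    try:
        instance["host"] = schedule(request_spec)
    except NoValidHost as exc:
        # Mirrors "Setting instance to ERROR state." in the log.
        instance["vm_state"] = "error"
        instance["fault"] = str(exc)


# One of the instances already marked ERROR above.
server = {"uuid": "9c8a34dd-510f-45c6-b748-f441f21e0b70"}
build_instance(server, request_spec={})
print(server["vm_state"], "-", server["fault"])
```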
[ 581.763178] nova-conductor[52553]: ERROR nova.conductor.manager [ 581.763178] nova-conductor[52553]: ERROR nova.conductor.manager [ 581.775870] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-fada62a4-d4b4-48ab-98a9-12a4b2741cb3 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 581.776123] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-fada62a4-d4b4-48ab-98a9-12a4b2741cb3 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 581.776299] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-fada62a4-d4b4-48ab-98a9-12a4b2741cb3 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 581.836871] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-fada62a4-d4b4-48ab-98a9-12a4b2741cb3 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 0637688f-a7a4-4df1-a56b-553812f5b449] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 581.838375] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-fada62a4-d4b4-48ab-98a9-12a4b2741cb3 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 581.838457] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-fada62a4-d4b4-48ab-98a9-12a4b2741cb3 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 581.838737] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-fada62a4-d4b4-48ab-98a9-12a4b2741cb3 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 581.842498] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-fada62a4-d4b4-48ab-98a9-12a4b2741cb3 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 581.842498] nova-conductor[52553]: Traceback (most recent call last): [ 581.842498] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 581.842498] nova-conductor[52553]: return func(*args, **kwargs) [ 581.842498] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 581.842498] nova-conductor[52553]: selections = self._select_destinations( [ 581.842498] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 581.842498] nova-conductor[52553]: selections = self._schedule( [ 581.842498] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 581.842498] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 581.842498] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 581.842498] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 581.842498] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 581.842498] nova-conductor[52553]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 581.843129] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-fada62a4-d4b4-48ab-98a9-12a4b2741cb3 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 0637688f-a7a4-4df1-a56b-553812f5b449] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 583.042031] nova-conductor[52554]: ERROR nova.conductor.manager [None req-93e55be2-2d6c-4ca6-8114-91c144aeb644 tempest-ImagesOneServerTestJSON-427840070 tempest-ImagesOneServerTestJSON-427840070-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 583.042031] nova-conductor[52554]: Traceback (most recent call last): [ 583.042031] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 583.042031] nova-conductor[52554]: return func(*args, **kwargs) [ 583.042031] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 583.042031] nova-conductor[52554]: selections = self._select_destinations( [ 583.042031] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 583.042031] nova-conductor[52554]: selections = self._schedule( [ 583.042031] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 583.042031] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 583.042031] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 583.042031] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 583.042031] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 583.042031] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 583.042031] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 583.042031] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 583.042031] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 583.042031] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 583.042031] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 583.042031] nova-conductor[52554]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 583.042031] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 583.042031] nova-conductor[52554]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 583.042031] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 583.042031] nova-conductor[52554]: ERROR nova.conductor.manager result = self.transport._send( [ 583.042031] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 583.042031] nova-conductor[52554]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 583.042031] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 583.042031] nova-conductor[52554]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 583.042031] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 583.042031] nova-conductor[52554]: ERROR nova.conductor.manager raise result [ 583.042031] nova-conductor[52554]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 583.042031] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 583.042031] nova-conductor[52554]: ERROR nova.conductor.manager [ 583.042031] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 583.042031] nova-conductor[52554]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 583.042031] nova-conductor[52554]: ERROR nova.conductor.manager [ 583.042031] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 583.042031] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._select_destinations( [ 583.042031] nova-conductor[52554]: ERROR nova.conductor.manager [ 583.042031] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 583.042031] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._schedule( [ 583.042031] nova-conductor[52554]: ERROR nova.conductor.manager [ 583.042031] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 583.042031] nova-conductor[52554]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 583.042031] nova-conductor[52554]: ERROR nova.conductor.manager [ 583.042031] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 583.042031] nova-conductor[52554]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 583.042031] nova-conductor[52554]: ERROR nova.conductor.manager [ 583.042031] nova-conductor[52554]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 583.042031] nova-conductor[52554]: ERROR nova.conductor.manager [ 583.042031] nova-conductor[52554]: ERROR nova.conductor.manager [ 583.051036] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-93e55be2-2d6c-4ca6-8114-91c144aeb644 tempest-ImagesOneServerTestJSON-427840070 tempest-ImagesOneServerTestJSON-427840070-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 583.051465] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-93e55be2-2d6c-4ca6-8114-91c144aeb644 tempest-ImagesOneServerTestJSON-427840070 tempest-ImagesOneServerTestJSON-427840070-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 583.051772] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-93e55be2-2d6c-4ca6-8114-91c144aeb644 tempest-ImagesOneServerTestJSON-427840070 tempest-ImagesOneServerTestJSON-427840070-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 583.114021] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-93e55be2-2d6c-4ca6-8114-91c144aeb644 tempest-ImagesOneServerTestJSON-427840070 tempest-ImagesOneServerTestJSON-427840070-project-member] [instance: bbd77202-5206-425a-96f7-e69110c2d58c] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 583.114021] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-93e55be2-2d6c-4ca6-8114-91c144aeb644 tempest-ImagesOneServerTestJSON-427840070 tempest-ImagesOneServerTestJSON-427840070-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 583.114021] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-93e55be2-2d6c-4ca6-8114-91c144aeb644 tempest-ImagesOneServerTestJSON-427840070 tempest-ImagesOneServerTestJSON-427840070-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 583.114021] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-93e55be2-2d6c-4ca6-8114-91c144aeb644 tempest-ImagesOneServerTestJSON-427840070 tempest-ImagesOneServerTestJSON-427840070-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 583.116788] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-93e55be2-2d6c-4ca6-8114-91c144aeb644 tempest-ImagesOneServerTestJSON-427840070 tempest-ImagesOneServerTestJSON-427840070-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 583.116788] nova-conductor[52554]: Traceback (most recent call last): [ 583.116788] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 583.116788] nova-conductor[52554]: return func(*args, **kwargs) [ 583.116788] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 583.116788] nova-conductor[52554]: selections = self._select_destinations( [ 583.116788] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 583.116788] nova-conductor[52554]: selections = self._schedule( [ 583.116788] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 583.116788] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 583.116788] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 583.116788] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 583.116788] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 583.116788] nova-conductor[52554]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 583.118118] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-93e55be2-2d6c-4ca6-8114-91c144aeb644 tempest-ImagesOneServerTestJSON-427840070 tempest-ImagesOneServerTestJSON-427840070-project-member] [instance: bbd77202-5206-425a-96f7-e69110c2d58c] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 585.755052] nova-conductor[52553]: ERROR nova.conductor.manager [None req-02cfa338-ab06-4340-a422-ace14c369657 tempest-ServerActionsTestJSON-1901113518 tempest-ServerActionsTestJSON-1901113518-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 585.755052] nova-conductor[52553]: Traceback (most recent call last): [ 585.755052] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 585.755052] nova-conductor[52553]: return func(*args, **kwargs) [ 585.755052] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 585.755052] nova-conductor[52553]: selections = self._select_destinations( [ 585.755052] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 585.755052] nova-conductor[52553]: selections = self._schedule( [ 585.755052] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 585.755052] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 585.755052] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 585.755052] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 585.755052] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 585.755052] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 585.755052] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 585.755052] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 585.755052] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 585.755052] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 585.755052] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 585.755052] nova-conductor[52553]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 585.755052] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 585.755052] nova-conductor[52553]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 585.755052] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 585.755052] nova-conductor[52553]: ERROR nova.conductor.manager result = self.transport._send( [ 585.755052] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 585.755052] nova-conductor[52553]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 585.755052] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 585.755052] nova-conductor[52553]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 585.755052] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 585.755052] nova-conductor[52553]: ERROR nova.conductor.manager raise result [ 585.755052] nova-conductor[52553]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 585.755052] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 585.755052] nova-conductor[52553]: ERROR nova.conductor.manager [ 585.755052] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 585.755052] nova-conductor[52553]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 585.755052] nova-conductor[52553]: ERROR nova.conductor.manager [ 585.755052] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 585.755052] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._select_destinations( [ 585.755052] nova-conductor[52553]: ERROR nova.conductor.manager [ 585.755052] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 585.755052] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._schedule( [ 585.755052] nova-conductor[52553]: ERROR nova.conductor.manager [ 585.755052] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 585.755052] nova-conductor[52553]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 585.755052] nova-conductor[52553]: ERROR nova.conductor.manager [ 585.755052] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 585.755052] nova-conductor[52553]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 585.755052] nova-conductor[52553]: ERROR nova.conductor.manager [ 585.755052] nova-conductor[52553]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 585.755052] nova-conductor[52553]: ERROR nova.conductor.manager [ 585.755052] nova-conductor[52553]: ERROR nova.conductor.manager [ 585.762130] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-02cfa338-ab06-4340-a422-ace14c369657 tempest-ServerActionsTestJSON-1901113518 tempest-ServerActionsTestJSON-1901113518-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 585.762390] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-02cfa338-ab06-4340-a422-ace14c369657 tempest-ServerActionsTestJSON-1901113518 tempest-ServerActionsTestJSON-1901113518-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 585.762589] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-02cfa338-ab06-4340-a422-ace14c369657 tempest-ServerActionsTestJSON-1901113518 tempest-ServerActionsTestJSON-1901113518-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 585.817681] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-02cfa338-ab06-4340-a422-ace14c369657 tempest-ServerActionsTestJSON-1901113518 tempest-ServerActionsTestJSON-1901113518-project-member] [instance: f43715a1-50ec-448a-9a6d-4fda11ee2099] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 585.818723] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-02cfa338-ab06-4340-a422-ace14c369657 tempest-ServerActionsTestJSON-1901113518 tempest-ServerActionsTestJSON-1901113518-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 585.818723] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-02cfa338-ab06-4340-a422-ace14c369657 tempest-ServerActionsTestJSON-1901113518 tempest-ServerActionsTestJSON-1901113518-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 585.818842] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-02cfa338-ab06-4340-a422-ace14c369657 tempest-ServerActionsTestJSON-1901113518 tempest-ServerActionsTestJSON-1901113518-project-member] Lock "00000000-0000-0000-0000-000000000000" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 585.822695] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-02cfa338-ab06-4340-a422-ace14c369657 tempest-ServerActionsTestJSON-1901113518 tempest-ServerActionsTestJSON-1901113518-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 585.822695] nova-conductor[52553]: Traceback (most recent call last): [ 585.822695] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 585.822695] nova-conductor[52553]: return func(*args, **kwargs) [ 585.822695] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 585.822695] nova-conductor[52553]: selections = self._select_destinations( [ 585.822695] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 585.822695] nova-conductor[52553]: selections = self._schedule( [ 585.822695] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 585.822695] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 585.822695] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 585.822695] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 585.822695] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 585.822695] nova-conductor[52553]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 585.823270] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-02cfa338-ab06-4340-a422-ace14c369657 tempest-ServerActionsTestJSON-1901113518 tempest-ServerActionsTestJSON-1901113518-project-member] [instance: f43715a1-50ec-448a-9a6d-4fda11ee2099] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 586.626817] nova-conductor[52553]: ERROR nova.conductor.manager [None req-6b913048-ad61-47b9-95e2-ea3f66a9cc7e tempest-ServerActionsV293TestJSON-1631288998 tempest-ServerActionsV293TestJSON-1631288998-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 586.626817] nova-conductor[52553]: Traceback (most recent call last): [ 586.626817] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 586.626817] nova-conductor[52553]: return func(*args, **kwargs) [ 586.626817] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 586.626817] nova-conductor[52553]: selections = self._select_destinations( [ 586.626817] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 586.626817] nova-conductor[52553]: selections = self._schedule( [ 586.626817] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 586.626817] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 586.626817] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 586.626817] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 586.626817] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 586.626817] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 586.626817] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 586.626817] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 586.626817] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 586.626817] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 586.626817] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 586.626817] nova-conductor[52553]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 586.626817] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 586.626817] nova-conductor[52553]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 586.626817] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 586.626817] nova-conductor[52553]: ERROR nova.conductor.manager result = self.transport._send( [ 586.626817] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 586.626817] nova-conductor[52553]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 586.626817] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 586.626817] nova-conductor[52553]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 586.626817] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 586.626817] nova-conductor[52553]: ERROR nova.conductor.manager raise result [ 586.626817] nova-conductor[52553]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 586.626817] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 586.626817] nova-conductor[52553]: ERROR nova.conductor.manager [ 586.626817] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 586.626817] nova-conductor[52553]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 586.626817] nova-conductor[52553]: ERROR nova.conductor.manager [ 586.626817] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 586.626817] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._select_destinations( [ 586.626817] nova-conductor[52553]: ERROR nova.conductor.manager [ 586.626817] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 586.626817] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._schedule( [ 586.626817] nova-conductor[52553]: ERROR nova.conductor.manager [ 586.626817] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 586.626817] nova-conductor[52553]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 586.626817] nova-conductor[52553]: ERROR nova.conductor.manager [ 586.626817] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 586.626817] nova-conductor[52553]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 586.626817] nova-conductor[52553]: ERROR nova.conductor.manager [ 586.626817] nova-conductor[52553]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
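The exception is raised in the scheduler process but logged by the conductor as nova.exception_Remote.NoValidHost_Remote. That suffix comes from oslo.messaging: when an RPC call fails, the serialized exception is rebuilt on the caller's side as a dynamically created subclass whose class name and module gain a "_Remote" suffix, so handlers written against the original type still match. The snippet below is a simplified illustration of that behaviour, not oslo.messaging's deserialization code; the NoValidHost class is again a stand-in.

class NoValidHost(Exception):
    """Stand-in for nova.exception.NoValidHost."""


# Roughly what the RPC client reconstructs from the failure it received:
NoValidHost_Remote = type("NoValidHost_Remote", (NoValidHost,), {})
NoValidHost_Remote.__module__ = "nova.exception_Remote"

try:
    raise NoValidHost_Remote(
        "No valid host was found. There are not enough hosts available.")
except NoValidHost as exc:
    # A handler for the normal exception type still catches the remote
    # subclass, so the conductor can treat the failure like a local one.
    print("%s.%s: %s" % (type(exc).__module__, type(exc).__name__, exc))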
[ 586.626817] nova-conductor[52553]: ERROR nova.conductor.manager [ 586.626817] nova-conductor[52553]: ERROR nova.conductor.manager [ 586.634116] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-6b913048-ad61-47b9-95e2-ea3f66a9cc7e tempest-ServerActionsV293TestJSON-1631288998 tempest-ServerActionsV293TestJSON-1631288998-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 586.634370] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-6b913048-ad61-47b9-95e2-ea3f66a9cc7e tempest-ServerActionsV293TestJSON-1631288998 tempest-ServerActionsV293TestJSON-1631288998-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 586.634545] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-6b913048-ad61-47b9-95e2-ea3f66a9cc7e tempest-ServerActionsV293TestJSON-1631288998 tempest-ServerActionsV293TestJSON-1631288998-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 586.687997] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-6b913048-ad61-47b9-95e2-ea3f66a9cc7e tempest-ServerActionsV293TestJSON-1631288998 tempest-ServerActionsV293TestJSON-1631288998-project-member] [instance: 38148aac-e7ce-467a-8d41-8493049511a4] block_device_mapping [BlockDeviceMapping(attachment_id=62b9023c-4c7b-4316-81d6-90e43c2dd5af,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='volume',device_name=None,device_type=None,disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id=None,instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='volume',tag=None,updated_at=,uuid=,volume_id='a7beb03a-c6eb-4350-9860-fa37eeef55ca',volume_size=1,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 586.688776] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-6b913048-ad61-47b9-95e2-ea3f66a9cc7e tempest-ServerActionsV293TestJSON-1631288998 tempest-ServerActionsV293TestJSON-1631288998-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 586.692018] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-6b913048-ad61-47b9-95e2-ea3f66a9cc7e tempest-ServerActionsV293TestJSON-1631288998 tempest-ServerActionsV293TestJSON-1631288998-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 586.692018] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-6b913048-ad61-47b9-95e2-ea3f66a9cc7e tempest-ServerActionsV293TestJSON-1631288998 
tempest-ServerActionsV293TestJSON-1631288998-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 586.694764] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-6b913048-ad61-47b9-95e2-ea3f66a9cc7e tempest-ServerActionsV293TestJSON-1631288998 tempest-ServerActionsV293TestJSON-1631288998-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 586.694764] nova-conductor[52553]: Traceback (most recent call last): [ 586.694764] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 586.694764] nova-conductor[52553]: return func(*args, **kwargs) [ 586.694764] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 586.694764] nova-conductor[52553]: selections = self._select_destinations( [ 586.694764] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 586.694764] nova-conductor[52553]: selections = self._schedule( [ 586.694764] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 586.694764] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 586.694764] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 586.694764] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 586.694764] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 586.694764] nova-conductor[52553]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 586.695514] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-6b913048-ad61-47b9-95e2-ea3f66a9cc7e tempest-ServerActionsV293TestJSON-1631288998 tempest-ServerActionsV293TestJSON-1631288998-project-member] [instance: 38148aac-e7ce-467a-8d41-8493049511a4] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 590.700384] nova-conductor[52554]: ERROR nova.conductor.manager [None req-8608cac2-38ae-4f23-a50f-57b3a435aa63 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 590.700384] nova-conductor[52554]: Traceback (most recent call last): [ 590.700384] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 590.700384] nova-conductor[52554]: return func(*args, **kwargs) [ 590.700384] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 590.700384] nova-conductor[52554]: selections = self._select_destinations( [ 590.700384] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 590.700384] nova-conductor[52554]: selections = self._schedule( [ 590.700384] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 590.700384] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 590.700384] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 590.700384] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 590.700384] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 590.700384] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 590.700384] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 590.700384] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 590.700384] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 590.700384] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 590.700384] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 590.700384] nova-conductor[52554]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 590.700384] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 590.700384] nova-conductor[52554]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 590.700384] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 590.700384] nova-conductor[52554]: ERROR nova.conductor.manager result = self.transport._send( [ 590.700384] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 590.700384] nova-conductor[52554]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 590.700384] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 590.700384] nova-conductor[52554]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 590.700384] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 590.700384] nova-conductor[52554]: ERROR nova.conductor.manager raise result [ 590.700384] nova-conductor[52554]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 590.700384] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 590.700384] nova-conductor[52554]: ERROR nova.conductor.manager [ 590.700384] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 590.700384] nova-conductor[52554]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 590.700384] nova-conductor[52554]: ERROR nova.conductor.manager [ 590.700384] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 590.700384] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._select_destinations( [ 590.700384] nova-conductor[52554]: ERROR nova.conductor.manager [ 590.700384] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 590.700384] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._schedule( [ 590.700384] nova-conductor[52554]: ERROR nova.conductor.manager [ 590.700384] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 590.700384] nova-conductor[52554]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 590.700384] nova-conductor[52554]: ERROR nova.conductor.manager [ 590.700384] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 590.700384] nova-conductor[52554]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 590.700384] nova-conductor[52554]: ERROR nova.conductor.manager [ 590.700384] nova-conductor[52554]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
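The DEBUG oslo_concurrency.lockutils lines surrounding each attempt show a named in-process lock, keyed by cell UUID, taken while set_target_cell caches per-cell database and message-queue connections; the waited/held times of ~0.000s indicate the lock only guards the cache lookup. A hedged sketch of that pattern with oslo.concurrency follows; the _CELL_CACHE dict and get_or_set_cached_cell function are illustrative names, not Nova's.

from oslo_concurrency import lockutils

_CELL_CACHE = {}

def get_or_set_cached_cell(cell_uuid, build_connection):
    # Serialize cache population per cell so concurrent requests do not
    # each build their own connection set for the same cell.
    with lockutils.lock(cell_uuid):
        if cell_uuid not in _CELL_CACHE:
            _CELL_CACHE[cell_uuid] = build_connection(cell_uuid)
        return _CELL_CACHE[cell_uuid]

print(get_or_set_cached_cell("00000000-0000-0000-0000-000000000000",
                             lambda uuid: "connections-for-" + uuid))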
[ 590.700384] nova-conductor[52554]: ERROR nova.conductor.manager [ 590.700384] nova-conductor[52554]: ERROR nova.conductor.manager [ 590.710172] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-8608cac2-38ae-4f23-a50f-57b3a435aa63 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 590.710172] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-8608cac2-38ae-4f23-a50f-57b3a435aa63 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 590.710172] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-8608cac2-38ae-4f23-a50f-57b3a435aa63 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 590.764990] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-8608cac2-38ae-4f23-a50f-57b3a435aa63 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: b0ba526b-7004-45cd-877c-d21d4e88b703] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 590.764990] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-8608cac2-38ae-4f23-a50f-57b3a435aa63 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 590.764990] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-8608cac2-38ae-4f23-a50f-57b3a435aa63 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 590.764990] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-8608cac2-38ae-4f23-a50f-57b3a435aa63 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 590.771831] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-8608cac2-38ae-4f23-a50f-57b3a435aa63 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 590.771831] nova-conductor[52554]: Traceback (most recent call last): [ 590.771831] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 590.771831] nova-conductor[52554]: return func(*args, **kwargs) [ 590.771831] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 590.771831] nova-conductor[52554]: selections = self._select_destinations( [ 590.771831] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 590.771831] nova-conductor[52554]: selections = self._schedule( [ 590.771831] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 590.771831] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 590.771831] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 590.771831] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 590.771831] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 590.771831] nova-conductor[52554]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 590.772219] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-8608cac2-38ae-4f23-a50f-57b3a435aa63 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: b0ba526b-7004-45cd-877c-d21d4e88b703] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 591.658685] nova-conductor[52553]: ERROR nova.conductor.manager [None req-53fd4e0c-6aba-438c-8326-d4c1be61ed35 tempest-ServerShowV254Test-666283222 tempest-ServerShowV254Test-666283222-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 591.658685] nova-conductor[52553]: Traceback (most recent call last): [ 591.658685] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 591.658685] nova-conductor[52553]: return func(*args, **kwargs) [ 591.658685] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 591.658685] nova-conductor[52553]: selections = self._select_destinations( [ 591.658685] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 591.658685] nova-conductor[52553]: selections = self._schedule( [ 591.658685] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 591.658685] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 591.658685] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 591.658685] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 591.658685] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 591.658685] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 591.658685] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 591.658685] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 591.658685] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 591.658685] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 591.658685] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 591.658685] nova-conductor[52553]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 591.658685] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 591.658685] nova-conductor[52553]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 591.658685] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 591.658685] nova-conductor[52553]: ERROR nova.conductor.manager result = self.transport._send( [ 591.658685] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 591.658685] nova-conductor[52553]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 591.658685] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 591.658685] nova-conductor[52553]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 591.658685] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 591.658685] nova-conductor[52553]: ERROR nova.conductor.manager raise result [ 591.658685] nova-conductor[52553]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 591.658685] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 591.658685] nova-conductor[52553]: ERROR nova.conductor.manager [ 591.658685] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 591.658685] nova-conductor[52553]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 591.658685] nova-conductor[52553]: ERROR nova.conductor.manager [ 591.658685] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 591.658685] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._select_destinations( [ 591.658685] nova-conductor[52553]: ERROR nova.conductor.manager [ 591.658685] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 591.658685] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._schedule( [ 591.658685] nova-conductor[52553]: ERROR nova.conductor.manager [ 591.658685] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 591.658685] nova-conductor[52553]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 591.658685] nova-conductor[52553]: ERROR nova.conductor.manager [ 591.658685] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 591.658685] nova-conductor[52553]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 591.658685] nova-conductor[52553]: ERROR nova.conductor.manager [ 591.658685] nova-conductor[52553]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
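Before each scheduling attempt fails, the conductor logs the BlockDeviceMapping it would have created: boot_index=0, source_type='image', destination_type='local', delete_on_termination=True with image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c (the ServerActionsV293 request above uses a source_type='volume' mapping instead). As a hedged illustration only, that local-image mapping corresponds to a block_device_mapping_v2 entry like the one below in a server-create request body; the image UUID is taken from the log, everything else is a generic example rather than the exact request Tempest sent.

import json

bdm_v2 = [{
    "boot_index": 0,
    "uuid": "856e89ba-b7a4-4a81-ad9d-2997fe327c0c",  # image id from the log
    "source_type": "image",
    "destination_type": "local",
    "delete_on_termination": True,
}]
print(json.dumps({"block_device_mapping_v2": bdm_v2}, indent=2))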
[ 591.658685] nova-conductor[52553]: ERROR nova.conductor.manager [ 591.658685] nova-conductor[52553]: ERROR nova.conductor.manager [ 591.663745] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-53fd4e0c-6aba-438c-8326-d4c1be61ed35 tempest-ServerShowV254Test-666283222 tempest-ServerShowV254Test-666283222-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 591.663983] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-53fd4e0c-6aba-438c-8326-d4c1be61ed35 tempest-ServerShowV254Test-666283222 tempest-ServerShowV254Test-666283222-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 591.664175] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-53fd4e0c-6aba-438c-8326-d4c1be61ed35 tempest-ServerShowV254Test-666283222 tempest-ServerShowV254Test-666283222-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 591.717297] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-53fd4e0c-6aba-438c-8326-d4c1be61ed35 tempest-ServerShowV254Test-666283222 tempest-ServerShowV254Test-666283222-project-member] [instance: 4f739535-273b-4806-897e-ddbf12138e6a] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 591.718463] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-53fd4e0c-6aba-438c-8326-d4c1be61ed35 tempest-ServerShowV254Test-666283222 tempest-ServerShowV254Test-666283222-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 591.718463] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-53fd4e0c-6aba-438c-8326-d4c1be61ed35 tempest-ServerShowV254Test-666283222 tempest-ServerShowV254Test-666283222-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 591.718463] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-53fd4e0c-6aba-438c-8326-d4c1be61ed35 tempest-ServerShowV254Test-666283222 tempest-ServerShowV254Test-666283222-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 591.723804] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-53fd4e0c-6aba-438c-8326-d4c1be61ed35 tempest-ServerShowV254Test-666283222 tempest-ServerShowV254Test-666283222-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 591.723804] nova-conductor[52553]: Traceback (most recent call last): [ 591.723804] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 591.723804] nova-conductor[52553]: return func(*args, **kwargs) [ 591.723804] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 591.723804] nova-conductor[52553]: selections = self._select_destinations( [ 591.723804] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 591.723804] nova-conductor[52553]: selections = self._schedule( [ 591.723804] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 591.723804] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 591.723804] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 591.723804] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 591.723804] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 591.723804] nova-conductor[52553]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 591.724238] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-53fd4e0c-6aba-438c-8326-d4c1be61ed35 tempest-ServerShowV254Test-666283222 tempest-ServerShowV254Test-666283222-project-member] [instance: 4f739535-273b-4806-897e-ddbf12138e6a] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 593.585892] nova-conductor[52554]: ERROR nova.conductor.manager [None req-96c2102f-84ea-490e-bb1a-dac1e6c16021 tempest-InstanceActionsV221TestJSON-1158573269 tempest-InstanceActionsV221TestJSON-1158573269-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 593.585892] nova-conductor[52554]: Traceback (most recent call last): [ 593.585892] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 593.585892] nova-conductor[52554]: return func(*args, **kwargs) [ 593.585892] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 593.585892] nova-conductor[52554]: selections = self._select_destinations( [ 593.585892] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 593.585892] nova-conductor[52554]: selections = self._schedule( [ 593.585892] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 593.585892] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 593.585892] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 593.585892] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 593.585892] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 593.585892] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 593.585892] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 593.585892] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 593.585892] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 593.585892] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 593.585892] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 593.585892] nova-conductor[52554]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 593.585892] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 593.585892] nova-conductor[52554]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 593.585892] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 593.585892] nova-conductor[52554]: ERROR nova.conductor.manager result = self.transport._send( [ 593.585892] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 593.585892] nova-conductor[52554]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 593.585892] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 593.585892] nova-conductor[52554]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 593.585892] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 593.585892] nova-conductor[52554]: ERROR nova.conductor.manager raise result [ 593.585892] nova-conductor[52554]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 593.585892] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 593.585892] nova-conductor[52554]: ERROR nova.conductor.manager [ 593.585892] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 593.585892] nova-conductor[52554]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 593.585892] nova-conductor[52554]: ERROR nova.conductor.manager [ 593.585892] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 593.585892] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._select_destinations( [ 593.585892] nova-conductor[52554]: ERROR nova.conductor.manager [ 593.585892] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 593.585892] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._schedule( [ 593.585892] nova-conductor[52554]: ERROR nova.conductor.manager [ 593.585892] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 593.585892] nova-conductor[52554]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 593.585892] nova-conductor[52554]: ERROR nova.conductor.manager [ 593.585892] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 593.585892] nova-conductor[52554]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 593.585892] nova-conductor[52554]: ERROR nova.conductor.manager [ 593.585892] nova-conductor[52554]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 593.585892] nova-conductor[52554]: ERROR nova.conductor.manager [ 593.585892] nova-conductor[52554]: ERROR nova.conductor.manager [ 593.595775] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-96c2102f-84ea-490e-bb1a-dac1e6c16021 tempest-InstanceActionsV221TestJSON-1158573269 tempest-InstanceActionsV221TestJSON-1158573269-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 593.596182] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-96c2102f-84ea-490e-bb1a-dac1e6c16021 tempest-InstanceActionsV221TestJSON-1158573269 tempest-InstanceActionsV221TestJSON-1158573269-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 593.596369] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-96c2102f-84ea-490e-bb1a-dac1e6c16021 tempest-InstanceActionsV221TestJSON-1158573269 tempest-InstanceActionsV221TestJSON-1158573269-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 593.650434] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-96c2102f-84ea-490e-bb1a-dac1e6c16021 tempest-InstanceActionsV221TestJSON-1158573269 tempest-InstanceActionsV221TestJSON-1158573269-project-member] [instance: b894028f-c5d3-4c54-9f32-004b6fd1d926] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 593.651179] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-96c2102f-84ea-490e-bb1a-dac1e6c16021 tempest-InstanceActionsV221TestJSON-1158573269 tempest-InstanceActionsV221TestJSON-1158573269-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 593.651397] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-96c2102f-84ea-490e-bb1a-dac1e6c16021 tempest-InstanceActionsV221TestJSON-1158573269 tempest-InstanceActionsV221TestJSON-1158573269-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 593.651603] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-96c2102f-84ea-490e-bb1a-dac1e6c16021 tempest-InstanceActionsV221TestJSON-1158573269 
tempest-InstanceActionsV221TestJSON-1158573269-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 593.654977] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-96c2102f-84ea-490e-bb1a-dac1e6c16021 tempest-InstanceActionsV221TestJSON-1158573269 tempest-InstanceActionsV221TestJSON-1158573269-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 593.654977] nova-conductor[52554]: Traceback (most recent call last):
[ 593.654977] nova-conductor[52554]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 593.654977] nova-conductor[52554]:     return func(*args, **kwargs)
[ 593.654977] nova-conductor[52554]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 593.654977] nova-conductor[52554]:     selections = self._select_destinations(
[ 593.654977] nova-conductor[52554]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 593.654977] nova-conductor[52554]:     selections = self._schedule(
[ 593.654977] nova-conductor[52554]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 593.654977] nova-conductor[52554]:     self._ensure_sufficient_hosts(
[ 593.654977] nova-conductor[52554]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 593.654977] nova-conductor[52554]:     raise exception.NoValidHost(reason=reason)
[ 593.654977] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 593.654977] nova-conductor[52554]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 593.655693] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-96c2102f-84ea-490e-bb1a-dac1e6c16021 tempest-InstanceActionsV221TestJSON-1158573269 tempest-InstanceActionsV221TestJSON-1158573269-project-member] [instance: b894028f-c5d3-4c54-9f32-004b6fd1d926] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 596.247414] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Took 0.17 seconds to select destinations for 1 instance(s).
{{(pid=52554) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 596.262206] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 596.263030] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 596.263030] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 596.299338] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 596.299862] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 596.300169] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 596.300589] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 596.300915] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 596.301146] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 596.312533] nova-conductor[52554]: DEBUG nova.quota [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Getting quotas for project 750bcd4b13bb4da9937e127e5abc1201. Resources: {'cores', 'instances', 'ram'} {{(pid=52554) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 596.315258] nova-conductor[52554]: DEBUG nova.quota [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Getting quotas for user a9e33dfaaf3a44f4b19c904d7f7d5be2 and project 750bcd4b13bb4da9937e127e5abc1201. Resources: {'cores', 'instances', 'ram'} {{(pid=52554) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 596.323466] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52554) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 596.323982] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 596.324361] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 596.324456] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 596.328014] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 6874067b-8e9b-4242-9a5f-6312f1484a00] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 596.329181] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 596.329181] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 596.329181] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 596.349977] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 596.350356] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 596.350426] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 603.917654] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Acquiring lock 
"e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 603.917654] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 603.917654] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 604.096178] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Took 0.14 seconds to select destinations for 1 instance(s). {{(pid=52553) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 604.113904] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 604.114158] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 604.114330] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 604.151830] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 604.152338] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 604.152338] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 604.152713] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 604.154053] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 604.154053] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 604.163425] nova-conductor[52553]: DEBUG nova.quota [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Getting quotas for project e13da93f325a4e68ad89ac46dfcb196b. Resources: {'cores', 'instances', 'ram'} {{(pid=52553) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 604.165832] nova-conductor[52553]: DEBUG nova.quota [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Getting quotas for user 599f90008649481b950c0d7600639837 and project e13da93f325a4e68ad89ac46dfcb196b. 
Resources: {'cores', 'instances', 'ram'} {{(pid=52553) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 604.172468] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52553) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 604.173089] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 604.173303] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 604.173473] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 604.180199] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] [instance: f03f507b-364f-41b9-ad33-dcb56ab03317] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 604.182558] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 604.182558] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 604.182558] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 604.199720] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 604.200222] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 604.200222] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 605.546852] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Took 0.15 seconds to select destinations for 1 instance(s). 
{{(pid=52554) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 605.559668] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 605.559908] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 605.560101] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 605.589845] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 605.590092] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 605.590275] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 605.590625] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 605.590816] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 605.590977] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 605.601571] nova-conductor[52554]: DEBUG nova.quota [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Getting quotas for project 75310bd38faf4daea1ed2e141769a330. Resources: {'cores', 'instances', 'ram'} {{(pid=52554) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 605.606965] nova-conductor[52554]: DEBUG nova.quota [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Getting quotas for user 76461c030e8e4c168de2b2924851a433 and project 75310bd38faf4daea1ed2e141769a330. Resources: {'cores', 'instances', 'ram'} {{(pid=52554) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 605.610849] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52554) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 605.611360] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 605.611567] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 605.611771] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 605.614987] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Took 0.15 seconds to select destinations for 1 instance(s). 
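The "block_device_mapping [BlockDeviceMapping(...)]" records in this section all describe an image-backed local root disk (source_type='image', destination_type='local', boot_index=0, delete_on_termination=True). Nova derives such a mapping automatically from an imageRef boot; an explicit equivalent in a server-create request would look roughly like the fragment below. Only the field values are taken from the log; the surrounding request structure, server name, and flavor placeholder are illustrative.

    # Illustrative block_device_mapping_v2 entry for an image-backed local root disk.
    server_create_body = {
        "server": {
            "name": "example-server",          # hypothetical name
            "flavorRef": "FLAVOR_ID",          # placeholder flavor reference
            "block_device_mapping_v2": [
                {
                    "boot_index": 0,
                    "uuid": "856e89ba-b7a4-4a81-ad9d-2997fe327c0c",  # image_id from the log
                    "source_type": "image",
                    "destination_type": "local",
                    "delete_on_termination": True,
                }
            ],
        }
    }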
{{(pid=52553) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 605.617376] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: ea4a243b-481f-421d-ba29-c88c828f754e] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 605.618026] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 605.618246] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 605.618417] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 605.633320] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 605.633561] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 605.633740] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: 
held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 605.637811] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 605.638042] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 605.638219] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 605.665560] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 605.665798] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 605.665970] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 605.666970] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 605.666970] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 605.666970] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 605.678854] nova-conductor[52553]: DEBUG nova.quota [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Getting quotas for project 6caa6881daf74a08b946fafd73ae022e. Resources: {'cores', 'instances', 'ram'} {{(pid=52553) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 605.682785] nova-conductor[52553]: DEBUG nova.quota [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Getting quotas for user 885501b7d394413b86aad917534c4eed and project 6caa6881daf74a08b946fafd73ae022e. Resources: {'cores', 'instances', 'ram'} {{(pid=52553) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 605.689690] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52553) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 605.690359] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 605.690597] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 605.690773] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 605.693882] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] [instance: df997589-61b6-4f68-9169-e6f9bee650c7] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 605.694575] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 605.694780] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 605.694956] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 605.709286] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 605.709588] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 605.709678] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 606.386797] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-6f309cba-5afa-4dfc-86e4-f0361c865fd0 tempest-ImagesOneServerNegativeTestJSON-2067137125 
tempest-ImagesOneServerNegativeTestJSON-2067137125-project-member] Took 0.18 seconds to select destinations for 1 instance(s). {{(pid=52554) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 606.401454] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-6f309cba-5afa-4dfc-86e4-f0361c865fd0 tempest-ImagesOneServerNegativeTestJSON-2067137125 tempest-ImagesOneServerNegativeTestJSON-2067137125-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 606.402042] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-6f309cba-5afa-4dfc-86e4-f0361c865fd0 tempest-ImagesOneServerNegativeTestJSON-2067137125 tempest-ImagesOneServerNegativeTestJSON-2067137125-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 606.402042] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-6f309cba-5afa-4dfc-86e4-f0361c865fd0 tempest-ImagesOneServerNegativeTestJSON-2067137125 tempest-ImagesOneServerNegativeTestJSON-2067137125-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 606.451813] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-6f309cba-5afa-4dfc-86e4-f0361c865fd0 tempest-ImagesOneServerNegativeTestJSON-2067137125 tempest-ImagesOneServerNegativeTestJSON-2067137125-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 606.453131] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-6f309cba-5afa-4dfc-86e4-f0361c865fd0 tempest-ImagesOneServerNegativeTestJSON-2067137125 tempest-ImagesOneServerNegativeTestJSON-2067137125-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 606.453131] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-6f309cba-5afa-4dfc-86e4-f0361c865fd0 tempest-ImagesOneServerNegativeTestJSON-2067137125 tempest-ImagesOneServerNegativeTestJSON-2067137125-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 606.453131] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-6f309cba-5afa-4dfc-86e4-f0361c865fd0 tempest-ImagesOneServerNegativeTestJSON-2067137125 tempest-ImagesOneServerNegativeTestJSON-2067137125-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 606.453131] nova-conductor[52554]: 
DEBUG oslo_concurrency.lockutils [None req-6f309cba-5afa-4dfc-86e4-f0361c865fd0 tempest-ImagesOneServerNegativeTestJSON-2067137125 tempest-ImagesOneServerNegativeTestJSON-2067137125-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 606.453131] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-6f309cba-5afa-4dfc-86e4-f0361c865fd0 tempest-ImagesOneServerNegativeTestJSON-2067137125 tempest-ImagesOneServerNegativeTestJSON-2067137125-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 606.461658] nova-conductor[52554]: DEBUG nova.quota [None req-6f309cba-5afa-4dfc-86e4-f0361c865fd0 tempest-ImagesOneServerNegativeTestJSON-2067137125 tempest-ImagesOneServerNegativeTestJSON-2067137125-project-member] Getting quotas for project 1f0caf1650924bf0876200b6fa28527a. Resources: {'cores', 'instances', 'ram'} {{(pid=52554) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 606.464400] nova-conductor[52554]: DEBUG nova.quota [None req-6f309cba-5afa-4dfc-86e4-f0361c865fd0 tempest-ImagesOneServerNegativeTestJSON-2067137125 tempest-ImagesOneServerNegativeTestJSON-2067137125-project-member] Getting quotas for user aa4017f6ffed49af9ac6a1bd51f92f1e and project 1f0caf1650924bf0876200b6fa28527a. Resources: {'cores', 'instances', 'ram'} {{(pid=52554) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 606.471324] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-6f309cba-5afa-4dfc-86e4-f0361c865fd0 tempest-ImagesOneServerNegativeTestJSON-2067137125 tempest-ImagesOneServerNegativeTestJSON-2067137125-project-member] [instance: b62dda0a-da1d-4109-a925-bb32d01da242] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52554) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 606.471839] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-6f309cba-5afa-4dfc-86e4-f0361c865fd0 tempest-ImagesOneServerNegativeTestJSON-2067137125 tempest-ImagesOneServerNegativeTestJSON-2067137125-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 606.472067] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-6f309cba-5afa-4dfc-86e4-f0361c865fd0 tempest-ImagesOneServerNegativeTestJSON-2067137125 tempest-ImagesOneServerNegativeTestJSON-2067137125-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 606.472242] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-6f309cba-5afa-4dfc-86e4-f0361c865fd0 tempest-ImagesOneServerNegativeTestJSON-2067137125 tempest-ImagesOneServerNegativeTestJSON-2067137125-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 606.475760] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-6f309cba-5afa-4dfc-86e4-f0361c865fd0 tempest-ImagesOneServerNegativeTestJSON-2067137125 tempest-ImagesOneServerNegativeTestJSON-2067137125-project-member] [instance: b62dda0a-da1d-4109-a925-bb32d01da242] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 606.478141] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-6f309cba-5afa-4dfc-86e4-f0361c865fd0 tempest-ImagesOneServerNegativeTestJSON-2067137125 tempest-ImagesOneServerNegativeTestJSON-2067137125-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 606.478141] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-6f309cba-5afa-4dfc-86e4-f0361c865fd0 tempest-ImagesOneServerNegativeTestJSON-2067137125 tempest-ImagesOneServerNegativeTestJSON-2067137125-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 606.478141] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-6f309cba-5afa-4dfc-86e4-f0361c865fd0 tempest-ImagesOneServerNegativeTestJSON-2067137125 tempest-ImagesOneServerNegativeTestJSON-2067137125-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 606.493847] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-6f309cba-5afa-4dfc-86e4-f0361c865fd0 tempest-ImagesOneServerNegativeTestJSON-2067137125 tempest-ImagesOneServerNegativeTestJSON-2067137125-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 606.494595] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-6f309cba-5afa-4dfc-86e4-f0361c865fd0 tempest-ImagesOneServerNegativeTestJSON-2067137125 tempest-ImagesOneServerNegativeTestJSON-2067137125-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 606.494595] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-6f309cba-5afa-4dfc-86e4-f0361c865fd0 tempest-ImagesOneServerNegativeTestJSON-2067137125 
tempest-ImagesOneServerNegativeTestJSON-2067137125-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 607.964577] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-49a8e915-bce4-4384-b6b7-adba96e5e1e8 tempest-ServerActionsTestOtherA-1665339402 tempest-ServerActionsTestOtherA-1665339402-project-member] Took 0.19 seconds to select destinations for 1 instance(s). {{(pid=52553) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 607.987227] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-49a8e915-bce4-4384-b6b7-adba96e5e1e8 tempest-ServerActionsTestOtherA-1665339402 tempest-ServerActionsTestOtherA-1665339402-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 607.987227] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-49a8e915-bce4-4384-b6b7-adba96e5e1e8 tempest-ServerActionsTestOtherA-1665339402 tempest-ServerActionsTestOtherA-1665339402-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 607.987227] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-49a8e915-bce4-4384-b6b7-adba96e5e1e8 tempest-ServerActionsTestOtherA-1665339402 tempest-ServerActionsTestOtherA-1665339402-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 608.050826] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-49a8e915-bce4-4384-b6b7-adba96e5e1e8 tempest-ServerActionsTestOtherA-1665339402 tempest-ServerActionsTestOtherA-1665339402-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 608.051129] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-49a8e915-bce4-4384-b6b7-adba96e5e1e8 tempest-ServerActionsTestOtherA-1665339402 tempest-ServerActionsTestOtherA-1665339402-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 608.051235] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-49a8e915-bce4-4384-b6b7-adba96e5e1e8 tempest-ServerActionsTestOtherA-1665339402 tempest-ServerActionsTestOtherA-1665339402-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 608.051597] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-49a8e915-bce4-4384-b6b7-adba96e5e1e8 
tempest-ServerActionsTestOtherA-1665339402 tempest-ServerActionsTestOtherA-1665339402-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 608.051805] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-49a8e915-bce4-4384-b6b7-adba96e5e1e8 tempest-ServerActionsTestOtherA-1665339402 tempest-ServerActionsTestOtherA-1665339402-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 608.051971] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-49a8e915-bce4-4384-b6b7-adba96e5e1e8 tempest-ServerActionsTestOtherA-1665339402 tempest-ServerActionsTestOtherA-1665339402-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 608.062951] nova-conductor[52553]: DEBUG nova.quota [None req-49a8e915-bce4-4384-b6b7-adba96e5e1e8 tempest-ServerActionsTestOtherA-1665339402 tempest-ServerActionsTestOtherA-1665339402-project-member] Getting quotas for project d1ebe0540da8498c815712744ca46d8c. Resources: {'cores', 'instances', 'ram'} {{(pid=52553) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 608.065704] nova-conductor[52553]: DEBUG nova.quota [None req-49a8e915-bce4-4384-b6b7-adba96e5e1e8 tempest-ServerActionsTestOtherA-1665339402 tempest-ServerActionsTestOtherA-1665339402-project-member] Getting quotas for user e068660e46ae432299b076a47e878201 and project d1ebe0540da8498c815712744ca46d8c. 
Resources: {'cores', 'instances', 'ram'} {{(pid=52553) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 608.072077] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-49a8e915-bce4-4384-b6b7-adba96e5e1e8 tempest-ServerActionsTestOtherA-1665339402 tempest-ServerActionsTestOtherA-1665339402-project-member] [instance: 6604de35-7683-4d5d-ac6f-13752ccb940c] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52553) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 608.072402] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-49a8e915-bce4-4384-b6b7-adba96e5e1e8 tempest-ServerActionsTestOtherA-1665339402 tempest-ServerActionsTestOtherA-1665339402-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 608.072605] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-49a8e915-bce4-4384-b6b7-adba96e5e1e8 tempest-ServerActionsTestOtherA-1665339402 tempest-ServerActionsTestOtherA-1665339402-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 608.072803] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-49a8e915-bce4-4384-b6b7-adba96e5e1e8 tempest-ServerActionsTestOtherA-1665339402 tempest-ServerActionsTestOtherA-1665339402-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 608.078019] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-49a8e915-bce4-4384-b6b7-adba96e5e1e8 tempest-ServerActionsTestOtherA-1665339402 tempest-ServerActionsTestOtherA-1665339402-project-member] [instance: 6604de35-7683-4d5d-ac6f-13752ccb940c] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 608.079147] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-49a8e915-bce4-4384-b6b7-adba96e5e1e8 tempest-ServerActionsTestOtherA-1665339402 tempest-ServerActionsTestOtherA-1665339402-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 608.079466] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-49a8e915-bce4-4384-b6b7-adba96e5e1e8 tempest-ServerActionsTestOtherA-1665339402 tempest-ServerActionsTestOtherA-1665339402-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 608.079770] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-49a8e915-bce4-4384-b6b7-adba96e5e1e8 tempest-ServerActionsTestOtherA-1665339402 tempest-ServerActionsTestOtherA-1665339402-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 608.097379] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-49a8e915-bce4-4384-b6b7-adba96e5e1e8 tempest-ServerActionsTestOtherA-1665339402 tempest-ServerActionsTestOtherA-1665339402-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 608.097379] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-49a8e915-bce4-4384-b6b7-adba96e5e1e8 tempest-ServerActionsTestOtherA-1665339402 tempest-ServerActionsTestOtherA-1665339402-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 608.097379] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-49a8e915-bce4-4384-b6b7-adba96e5e1e8 tempest-ServerActionsTestOtherA-1665339402 tempest-ServerActionsTestOtherA-1665339402-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 609.400024] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-5718d835-b6db-483d-96db-232287d8ec49 tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Took 0.18 seconds to select destinations for 1 instance(s). 
{{(pid=52554) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 609.420336] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-5718d835-b6db-483d-96db-232287d8ec49 tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 609.420623] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-5718d835-b6db-483d-96db-232287d8ec49 tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 609.421293] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-5718d835-b6db-483d-96db-232287d8ec49 tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 609.469219] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-5718d835-b6db-483d-96db-232287d8ec49 tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 609.469219] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-5718d835-b6db-483d-96db-232287d8ec49 tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 609.469219] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-5718d835-b6db-483d-96db-232287d8ec49 tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 609.469219] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-5718d835-b6db-483d-96db-232287d8ec49 tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 609.469219] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-5718d835-b6db-483d-96db-232287d8ec49 tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Lock 
"e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 609.469219] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-5718d835-b6db-483d-96db-232287d8ec49 tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 609.476642] nova-conductor[52554]: DEBUG nova.quota [None req-5718d835-b6db-483d-96db-232287d8ec49 tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Getting quotas for project 970c34795c0548b7a6e5fa4c5111d765. Resources: {'cores', 'instances', 'ram'} {{(pid=52554) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 609.480523] nova-conductor[52554]: DEBUG nova.quota [None req-5718d835-b6db-483d-96db-232287d8ec49 tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Getting quotas for user c5d1b8048ae549c187120b357900ffd6 and project 970c34795c0548b7a6e5fa4c5111d765. Resources: {'cores', 'instances', 'ram'} {{(pid=52554) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 609.494026] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-5718d835-b6db-483d-96db-232287d8ec49 tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] [instance: 189903f4-37c9-4331-bb23-245ed68ecaae] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52554) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 609.494026] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-5718d835-b6db-483d-96db-232287d8ec49 tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 609.494026] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-5718d835-b6db-483d-96db-232287d8ec49 tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 609.494026] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-5718d835-b6db-483d-96db-232287d8ec49 tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 609.496356] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-5718d835-b6db-483d-96db-232287d8ec49 tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] 
[instance: 189903f4-37c9-4331-bb23-245ed68ecaae] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 609.497466] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-5718d835-b6db-483d-96db-232287d8ec49 tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 609.497681] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-5718d835-b6db-483d-96db-232287d8ec49 tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 609.497858] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-5718d835-b6db-483d-96db-232287d8ec49 tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 609.516123] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-5718d835-b6db-483d-96db-232287d8ec49 tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 609.516312] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-5718d835-b6db-483d-96db-232287d8ec49 tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 609.516484] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-5718d835-b6db-483d-96db-232287d8ec49 tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.184727] nova-conductor[52554]: DEBUG nova.conductor.manager [None 
req-50852618-bfb2-45e1-9afb-62ea04595c1e tempest-ServersNegativeTestJSON-888982916 tempest-ServersNegativeTestJSON-888982916-project-member] Took 0.17 seconds to select destinations for 1 instance(s). {{(pid=52554) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 610.207021] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-50852618-bfb2-45e1-9afb-62ea04595c1e tempest-ServersNegativeTestJSON-888982916 tempest-ServersNegativeTestJSON-888982916-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.207021] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-50852618-bfb2-45e1-9afb-62ea04595c1e tempest-ServersNegativeTestJSON-888982916 tempest-ServersNegativeTestJSON-888982916-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.207021] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-50852618-bfb2-45e1-9afb-62ea04595c1e tempest-ServersNegativeTestJSON-888982916 tempest-ServersNegativeTestJSON-888982916-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.213879] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-ada24a6c-d679-430b-9e9c-803e8277e103 tempest-ServerAddressesTestJSON-1259181591 tempest-ServerAddressesTestJSON-1259181591-project-member] Took 0.21 seconds to select destinations for 1 instance(s). 
{{(pid=52553) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 610.229635] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ada24a6c-d679-430b-9e9c-803e8277e103 tempest-ServerAddressesTestJSON-1259181591 tempest-ServerAddressesTestJSON-1259181591-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.229895] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ada24a6c-d679-430b-9e9c-803e8277e103 tempest-ServerAddressesTestJSON-1259181591 tempest-ServerAddressesTestJSON-1259181591-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.230081] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ada24a6c-d679-430b-9e9c-803e8277e103 tempest-ServerAddressesTestJSON-1259181591 tempest-ServerAddressesTestJSON-1259181591-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.243920] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-50852618-bfb2-45e1-9afb-62ea04595c1e tempest-ServersNegativeTestJSON-888982916 tempest-ServersNegativeTestJSON-888982916-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.244069] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-50852618-bfb2-45e1-9afb-62ea04595c1e tempest-ServersNegativeTestJSON-888982916 tempest-ServersNegativeTestJSON-888982916-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.245034] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-50852618-bfb2-45e1-9afb-62ea04595c1e tempest-ServersNegativeTestJSON-888982916 tempest-ServersNegativeTestJSON-888982916-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.245034] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-50852618-bfb2-45e1-9afb-62ea04595c1e tempest-ServersNegativeTestJSON-888982916 tempest-ServersNegativeTestJSON-888982916-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.245034] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-50852618-bfb2-45e1-9afb-62ea04595c1e tempest-ServersNegativeTestJSON-888982916 tempest-ServersNegativeTestJSON-888982916-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.245034] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-50852618-bfb2-45e1-9afb-62ea04595c1e tempest-ServersNegativeTestJSON-888982916 tempest-ServersNegativeTestJSON-888982916-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.254343] nova-conductor[52554]: DEBUG nova.quota [None req-50852618-bfb2-45e1-9afb-62ea04595c1e tempest-ServersNegativeTestJSON-888982916 tempest-ServersNegativeTestJSON-888982916-project-member] Getting quotas for project 5a7bd292f9494a83a490e9ecefad552d. Resources: {'cores', 'instances', 'ram'} {{(pid=52554) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 610.258575] nova-conductor[52554]: DEBUG nova.quota [None req-50852618-bfb2-45e1-9afb-62ea04595c1e tempest-ServersNegativeTestJSON-888982916 tempest-ServersNegativeTestJSON-888982916-project-member] Getting quotas for user 07b2bd65e84647609b8d23ff587d71bf and project 5a7bd292f9494a83a490e9ecefad552d. Resources: {'cores', 'instances', 'ram'} {{(pid=52554) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 610.265337] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-50852618-bfb2-45e1-9afb-62ea04595c1e tempest-ServersNegativeTestJSON-888982916 tempest-ServersNegativeTestJSON-888982916-project-member] [instance: 75cc0c18-27d3-4074-897b-08812a11829c] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52554) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 610.265546] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ada24a6c-d679-430b-9e9c-803e8277e103 tempest-ServerAddressesTestJSON-1259181591 tempest-ServerAddressesTestJSON-1259181591-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.265777] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ada24a6c-d679-430b-9e9c-803e8277e103 tempest-ServerAddressesTestJSON-1259181591 tempest-ServerAddressesTestJSON-1259181591-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.265996] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-50852618-bfb2-45e1-9afb-62ea04595c1e tempest-ServersNegativeTestJSON-888982916 tempest-ServersNegativeTestJSON-888982916-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.266216] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-50852618-bfb2-45e1-9afb-62ea04595c1e tempest-ServersNegativeTestJSON-888982916 tempest-ServersNegativeTestJSON-888982916-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.266370] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ada24a6c-d679-430b-9e9c-803e8277e103 tempest-ServerAddressesTestJSON-1259181591 tempest-ServerAddressesTestJSON-1259181591-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.266765] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ada24a6c-d679-430b-9e9c-803e8277e103 tempest-ServerAddressesTestJSON-1259181591 tempest-ServerAddressesTestJSON-1259181591-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.267830] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ada24a6c-d679-430b-9e9c-803e8277e103 tempest-ServerAddressesTestJSON-1259181591 tempest-ServerAddressesTestJSON-1259181591-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.267830] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ada24a6c-d679-430b-9e9c-803e8277e103 tempest-ServerAddressesTestJSON-1259181591 tempest-ServerAddressesTestJSON-1259181591-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.267987] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-50852618-bfb2-45e1-9afb-62ea04595c1e tempest-ServersNegativeTestJSON-888982916 tempest-ServersNegativeTestJSON-888982916-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.273277] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-50852618-bfb2-45e1-9afb-62ea04595c1e tempest-ServersNegativeTestJSON-888982916 tempest-ServersNegativeTestJSON-888982916-project-member] [instance: 75cc0c18-27d3-4074-897b-08812a11829c] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 610.273277] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-50852618-bfb2-45e1-9afb-62ea04595c1e tempest-ServersNegativeTestJSON-888982916 
tempest-ServersNegativeTestJSON-888982916-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.273277] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-50852618-bfb2-45e1-9afb-62ea04595c1e tempest-ServersNegativeTestJSON-888982916 tempest-ServersNegativeTestJSON-888982916-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.273277] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-50852618-bfb2-45e1-9afb-62ea04595c1e tempest-ServersNegativeTestJSON-888982916 tempest-ServersNegativeTestJSON-888982916-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.278088] nova-conductor[52553]: DEBUG nova.quota [None req-ada24a6c-d679-430b-9e9c-803e8277e103 tempest-ServerAddressesTestJSON-1259181591 tempest-ServerAddressesTestJSON-1259181591-project-member] Getting quotas for project 5ed0e2edc8be422b895b459ed8dbef6b. Resources: {'cores', 'instances', 'ram'} {{(pid=52553) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 610.281800] nova-conductor[52553]: DEBUG nova.quota [None req-ada24a6c-d679-430b-9e9c-803e8277e103 tempest-ServerAddressesTestJSON-1259181591 tempest-ServerAddressesTestJSON-1259181591-project-member] Getting quotas for user bba253130c224b12a4c024eb941a3a47 and project 5ed0e2edc8be422b895b459ed8dbef6b. 
Resources: {'cores', 'instances', 'ram'} {{(pid=52553) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 610.290226] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-ada24a6c-d679-430b-9e9c-803e8277e103 tempest-ServerAddressesTestJSON-1259181591 tempest-ServerAddressesTestJSON-1259181591-project-member] [instance: 22aa54d4-80ec-4d56-9239-41810c469b9e] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52553) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 610.291064] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ada24a6c-d679-430b-9e9c-803e8277e103 tempest-ServerAddressesTestJSON-1259181591 tempest-ServerAddressesTestJSON-1259181591-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.291064] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ada24a6c-d679-430b-9e9c-803e8277e103 tempest-ServerAddressesTestJSON-1259181591 tempest-ServerAddressesTestJSON-1259181591-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.291416] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ada24a6c-d679-430b-9e9c-803e8277e103 tempest-ServerAddressesTestJSON-1259181591 tempest-ServerAddressesTestJSON-1259181591-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.297825] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-50852618-bfb2-45e1-9afb-62ea04595c1e tempest-ServersNegativeTestJSON-888982916 tempest-ServersNegativeTestJSON-888982916-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.298074] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-50852618-bfb2-45e1-9afb-62ea04595c1e tempest-ServersNegativeTestJSON-888982916 tempest-ServersNegativeTestJSON-888982916-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.298259] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-50852618-bfb2-45e1-9afb-62ea04595c1e tempest-ServersNegativeTestJSON-888982916 tempest-ServersNegativeTestJSON-888982916-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.299900] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-ada24a6c-d679-430b-9e9c-803e8277e103 tempest-ServerAddressesTestJSON-1259181591 tempest-ServerAddressesTestJSON-1259181591-project-member] [instance: 
22aa54d4-80ec-4d56-9239-41810c469b9e] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 610.301397] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ada24a6c-d679-430b-9e9c-803e8277e103 tempest-ServerAddressesTestJSON-1259181591 tempest-ServerAddressesTestJSON-1259181591-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.301397] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ada24a6c-d679-430b-9e9c-803e8277e103 tempest-ServerAddressesTestJSON-1259181591 tempest-ServerAddressesTestJSON-1259181591-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.301397] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ada24a6c-d679-430b-9e9c-803e8277e103 tempest-ServerAddressesTestJSON-1259181591 tempest-ServerAddressesTestJSON-1259181591-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.316014] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ada24a6c-d679-430b-9e9c-803e8277e103 tempest-ServerAddressesTestJSON-1259181591 tempest-ServerAddressesTestJSON-1259181591-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.316243] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ada24a6c-d679-430b-9e9c-803e8277e103 tempest-ServerAddressesTestJSON-1259181591 tempest-ServerAddressesTestJSON-1259181591-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.316387] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-ada24a6c-d679-430b-9e9c-803e8277e103 tempest-ServerAddressesTestJSON-1259181591 tempest-ServerAddressesTestJSON-1259181591-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.661115] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-0f1e22a8-9fae-4bca-95a7-848e187055da 
tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Took 0.12 seconds to select destinations for 1 instance(s). {{(pid=52553) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 610.673427] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-0f1e22a8-9fae-4bca-95a7-848e187055da tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.673661] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-0f1e22a8-9fae-4bca-95a7-848e187055da tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.673848] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-0f1e22a8-9fae-4bca-95a7-848e187055da tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.699453] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-0f1e22a8-9fae-4bca-95a7-848e187055da tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.699808] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-0f1e22a8-9fae-4bca-95a7-848e187055da tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.699932] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-0f1e22a8-9fae-4bca-95a7-848e187055da tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.700222] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-0f1e22a8-9fae-4bca-95a7-848e187055da tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.700413] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None 
req-0f1e22a8-9fae-4bca-95a7-848e187055da tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.700579] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-0f1e22a8-9fae-4bca-95a7-848e187055da tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.709092] nova-conductor[52553]: DEBUG nova.quota [None req-0f1e22a8-9fae-4bca-95a7-848e187055da tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Getting quotas for project 970c34795c0548b7a6e5fa4c5111d765. Resources: {'cores', 'instances', 'ram'} {{(pid=52553) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 610.717818] nova-conductor[52553]: DEBUG nova.quota [None req-0f1e22a8-9fae-4bca-95a7-848e187055da tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Getting quotas for user c5d1b8048ae549c187120b357900ffd6 and project 970c34795c0548b7a6e5fa4c5111d765. Resources: {'cores', 'instances', 'ram'} {{(pid=52553) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 610.722614] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-0f1e22a8-9fae-4bca-95a7-848e187055da tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] [instance: 885fe65d-ee02-4ed7-8d59-109775086038] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52553) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 610.723153] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-0f1e22a8-9fae-4bca-95a7-848e187055da tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.723407] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-0f1e22a8-9fae-4bca-95a7-848e187055da tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.723868] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-0f1e22a8-9fae-4bca-95a7-848e187055da tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.732692] nova-conductor[52553]: DEBUG nova.conductor.manager 
[None req-0f1e22a8-9fae-4bca-95a7-848e187055da tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] [instance: 885fe65d-ee02-4ed7-8d59-109775086038] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 610.733514] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-0f1e22a8-9fae-4bca-95a7-848e187055da tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.733653] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-0f1e22a8-9fae-4bca-95a7-848e187055da tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.733873] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-0f1e22a8-9fae-4bca-95a7-848e187055da tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.747332] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-0f1e22a8-9fae-4bca-95a7-848e187055da tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.747607] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-0f1e22a8-9fae-4bca-95a7-848e187055da tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.747726] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-0f1e22a8-9fae-4bca-95a7-848e187055da tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 611.856534] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-b194fefe-26fd-4f0e-aac8-c4c07268c0cc tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Took 0.15 seconds to select destinations for 1 instance(s). {{(pid=52554) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 611.872243] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b194fefe-26fd-4f0e-aac8-c4c07268c0cc tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 611.872496] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b194fefe-26fd-4f0e-aac8-c4c07268c0cc tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 611.872674] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b194fefe-26fd-4f0e-aac8-c4c07268c0cc tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 611.892006] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-9954a97c-742a-40d5-b761-9e61e7e7d295 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Took 0.14 seconds to select destinations for 1 instance(s). 
{{(pid=52553) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 611.904121] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-9954a97c-742a-40d5-b761-9e61e7e7d295 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 611.904121] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-9954a97c-742a-40d5-b761-9e61e7e7d295 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 611.904121] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-9954a97c-742a-40d5-b761-9e61e7e7d295 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 611.906942] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b194fefe-26fd-4f0e-aac8-c4c07268c0cc tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 611.907234] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b194fefe-26fd-4f0e-aac8-c4c07268c0cc tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 611.907446] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b194fefe-26fd-4f0e-aac8-c4c07268c0cc tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 611.907850] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b194fefe-26fd-4f0e-aac8-c4c07268c0cc tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 611.908109] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b194fefe-26fd-4f0e-aac8-c4c07268c0cc tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Lock 
"e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 611.908347] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b194fefe-26fd-4f0e-aac8-c4c07268c0cc tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 611.916722] nova-conductor[52554]: DEBUG nova.quota [None req-b194fefe-26fd-4f0e-aac8-c4c07268c0cc tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Getting quotas for project 970c34795c0548b7a6e5fa4c5111d765. Resources: {'cores', 'instances', 'ram'} {{(pid=52554) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 611.919135] nova-conductor[52554]: DEBUG nova.quota [None req-b194fefe-26fd-4f0e-aac8-c4c07268c0cc tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Getting quotas for user c5d1b8048ae549c187120b357900ffd6 and project 970c34795c0548b7a6e5fa4c5111d765. Resources: {'cores', 'instances', 'ram'} {{(pid=52554) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 611.924936] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-b194fefe-26fd-4f0e-aac8-c4c07268c0cc tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] [instance: ec879414-4534-4d0e-a65e-65baff80b16e] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52554) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 611.925456] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b194fefe-26fd-4f0e-aac8-c4c07268c0cc tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 611.925663] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b194fefe-26fd-4f0e-aac8-c4c07268c0cc tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 611.925945] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b194fefe-26fd-4f0e-aac8-c4c07268c0cc tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 611.929588] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-b194fefe-26fd-4f0e-aac8-c4c07268c0cc tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] 
[instance: ec879414-4534-4d0e-a65e-65baff80b16e] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 611.930402] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b194fefe-26fd-4f0e-aac8-c4c07268c0cc tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 611.930498] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b194fefe-26fd-4f0e-aac8-c4c07268c0cc tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 611.930666] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b194fefe-26fd-4f0e-aac8-c4c07268c0cc tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 611.937115] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-9954a97c-742a-40d5-b761-9e61e7e7d295 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 611.937346] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-9954a97c-742a-40d5-b761-9e61e7e7d295 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 611.937577] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-9954a97c-742a-40d5-b761-9e61e7e7d295 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 611.938025] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None 
req-9954a97c-742a-40d5-b761-9e61e7e7d295 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 611.938221] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-9954a97c-742a-40d5-b761-9e61e7e7d295 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 611.938384] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-9954a97c-742a-40d5-b761-9e61e7e7d295 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 611.946024] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b194fefe-26fd-4f0e-aac8-c4c07268c0cc tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 611.946024] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b194fefe-26fd-4f0e-aac8-c4c07268c0cc tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 611.946024] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b194fefe-26fd-4f0e-aac8-c4c07268c0cc tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 611.946533] nova-conductor[52553]: DEBUG nova.quota [None req-9954a97c-742a-40d5-b761-9e61e7e7d295 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Getting quotas for project 9f746698954f479ead437dc1b5bd15cd. Resources: {'cores', 'instances', 'ram'} {{(pid=52553) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 611.948835] nova-conductor[52553]: DEBUG nova.quota [None req-9954a97c-742a-40d5-b761-9e61e7e7d295 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Getting quotas for user c6e04e7912d740da8410822b968c784a and project 9f746698954f479ead437dc1b5bd15cd. 
Resources: {'cores', 'instances', 'ram'} {{(pid=52553) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 611.954613] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-9954a97c-742a-40d5-b761-9e61e7e7d295 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] [instance: cde76b14-ee01-44c8-8004-39cdf91e9889] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52553) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 611.955108] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-9954a97c-742a-40d5-b761-9e61e7e7d295 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 611.956102] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-9954a97c-742a-40d5-b761-9e61e7e7d295 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 611.956102] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-9954a97c-742a-40d5-b761-9e61e7e7d295 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 611.959819] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-9954a97c-742a-40d5-b761-9e61e7e7d295 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] [instance: cde76b14-ee01-44c8-8004-39cdf91e9889] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 611.960493] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-9954a97c-742a-40d5-b761-9e61e7e7d295 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 611.960741] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-9954a97c-742a-40d5-b761-9e61e7e7d295 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 611.960950] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-9954a97c-742a-40d5-b761-9e61e7e7d295 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 611.976757] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-9954a97c-742a-40d5-b761-9e61e7e7d295 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 611.977286] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-9954a97c-742a-40d5-b761-9e61e7e7d295 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 611.977486] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-9954a97c-742a-40d5-b761-9e61e7e7d295 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 612.342441] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-bba4c956-14a4-4f9e-a55a-3dfb1205b3a3 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Took 0.12 seconds to select destinations for 1 instance(s). 
{{(pid=52554) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 612.355268] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-bba4c956-14a4-4f9e-a55a-3dfb1205b3a3 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 612.355501] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-bba4c956-14a4-4f9e-a55a-3dfb1205b3a3 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 612.355777] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-bba4c956-14a4-4f9e-a55a-3dfb1205b3a3 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 612.384473] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-bba4c956-14a4-4f9e-a55a-3dfb1205b3a3 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 612.384746] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-bba4c956-14a4-4f9e-a55a-3dfb1205b3a3 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 612.384877] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-bba4c956-14a4-4f9e-a55a-3dfb1205b3a3 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 612.385343] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-bba4c956-14a4-4f9e-a55a-3dfb1205b3a3 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 612.385539] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-bba4c956-14a4-4f9e-a55a-3dfb1205b3a3 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 612.385708] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-bba4c956-14a4-4f9e-a55a-3dfb1205b3a3 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 612.393563] nova-conductor[52554]: DEBUG nova.quota [None req-bba4c956-14a4-4f9e-a55a-3dfb1205b3a3 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Getting quotas for project 9f746698954f479ead437dc1b5bd15cd. Resources: {'cores', 'instances', 'ram'} {{(pid=52554) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 612.395893] nova-conductor[52554]: DEBUG nova.quota [None req-bba4c956-14a4-4f9e-a55a-3dfb1205b3a3 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Getting quotas for user c6e04e7912d740da8410822b968c784a and project 9f746698954f479ead437dc1b5bd15cd. Resources: {'cores', 'instances', 'ram'} {{(pid=52554) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 612.401876] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-bba4c956-14a4-4f9e-a55a-3dfb1205b3a3 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] [instance: ca732c56-b1d1-40bf-96b6-4b93bc5ff29d] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52554) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 612.402388] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-bba4c956-14a4-4f9e-a55a-3dfb1205b3a3 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 612.402626] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-bba4c956-14a4-4f9e-a55a-3dfb1205b3a3 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 612.402769] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-bba4c956-14a4-4f9e-a55a-3dfb1205b3a3 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 612.406355] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-bba4c956-14a4-4f9e-a55a-3dfb1205b3a3 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] [instance: ca732c56-b1d1-40bf-96b6-4b93bc5ff29d] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 612.406989] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-bba4c956-14a4-4f9e-a55a-3dfb1205b3a3 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 612.407213] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-bba4c956-14a4-4f9e-a55a-3dfb1205b3a3 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 612.407457] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-bba4c956-14a4-4f9e-a55a-3dfb1205b3a3 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 612.420325] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-bba4c956-14a4-4f9e-a55a-3dfb1205b3a3 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 612.420537] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-bba4c956-14a4-4f9e-a55a-3dfb1205b3a3 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 612.420712] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-bba4c956-14a4-4f9e-a55a-3dfb1205b3a3 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 615.571906] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-436aab8f-2d4a-47ba-b2c0-250864227759 tempest-TenantUsagesTestJSON-136728812 
tempest-TenantUsagesTestJSON-136728812-project-member] Took 0.13 seconds to select destinations for 1 instance(s). {{(pid=52553) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 615.584924] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-436aab8f-2d4a-47ba-b2c0-250864227759 tempest-TenantUsagesTestJSON-136728812 tempest-TenantUsagesTestJSON-136728812-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 615.584924] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-436aab8f-2d4a-47ba-b2c0-250864227759 tempest-TenantUsagesTestJSON-136728812 tempest-TenantUsagesTestJSON-136728812-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 615.585029] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-436aab8f-2d4a-47ba-b2c0-250864227759 tempest-TenantUsagesTestJSON-136728812 tempest-TenantUsagesTestJSON-136728812-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 615.621830] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-436aab8f-2d4a-47ba-b2c0-250864227759 tempest-TenantUsagesTestJSON-136728812 tempest-TenantUsagesTestJSON-136728812-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 615.622089] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-436aab8f-2d4a-47ba-b2c0-250864227759 tempest-TenantUsagesTestJSON-136728812 tempest-TenantUsagesTestJSON-136728812-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 615.622268] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-436aab8f-2d4a-47ba-b2c0-250864227759 tempest-TenantUsagesTestJSON-136728812 tempest-TenantUsagesTestJSON-136728812-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 615.622618] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-436aab8f-2d4a-47ba-b2c0-250864227759 tempest-TenantUsagesTestJSON-136728812 tempest-TenantUsagesTestJSON-136728812-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 615.623144] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-436aab8f-2d4a-47ba-b2c0-250864227759 tempest-TenantUsagesTestJSON-136728812 tempest-TenantUsagesTestJSON-136728812-project-member] Lock 
"e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 615.623361] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-436aab8f-2d4a-47ba-b2c0-250864227759 tempest-TenantUsagesTestJSON-136728812 tempest-TenantUsagesTestJSON-136728812-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 615.630944] nova-conductor[52553]: DEBUG nova.quota [None req-436aab8f-2d4a-47ba-b2c0-250864227759 tempest-TenantUsagesTestJSON-136728812 tempest-TenantUsagesTestJSON-136728812-project-member] Getting quotas for project 9ddd52dadf724da1b5d7b602ffd6f3c1. Resources: {'cores', 'instances', 'ram'} {{(pid=52553) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 615.633555] nova-conductor[52553]: DEBUG nova.quota [None req-436aab8f-2d4a-47ba-b2c0-250864227759 tempest-TenantUsagesTestJSON-136728812 tempest-TenantUsagesTestJSON-136728812-project-member] Getting quotas for user dd87d349a62e40aeb1df4bc0129b2f49 and project 9ddd52dadf724da1b5d7b602ffd6f3c1. Resources: {'cores', 'instances', 'ram'} {{(pid=52553) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 615.641252] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-436aab8f-2d4a-47ba-b2c0-250864227759 tempest-TenantUsagesTestJSON-136728812 tempest-TenantUsagesTestJSON-136728812-project-member] [instance: 6f0c0004-7fd2-49bf-bb1e-48774c481497] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52553) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 615.641747] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-436aab8f-2d4a-47ba-b2c0-250864227759 tempest-TenantUsagesTestJSON-136728812 tempest-TenantUsagesTestJSON-136728812-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 615.641963] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-436aab8f-2d4a-47ba-b2c0-250864227759 tempest-TenantUsagesTestJSON-136728812 tempest-TenantUsagesTestJSON-136728812-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 615.642152] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-436aab8f-2d4a-47ba-b2c0-250864227759 tempest-TenantUsagesTestJSON-136728812 tempest-TenantUsagesTestJSON-136728812-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 615.644924] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-436aab8f-2d4a-47ba-b2c0-250864227759 tempest-TenantUsagesTestJSON-136728812 tempest-TenantUsagesTestJSON-136728812-project-member] [instance: 6f0c0004-7fd2-49bf-bb1e-48774c481497] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 615.645598] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-436aab8f-2d4a-47ba-b2c0-250864227759 tempest-TenantUsagesTestJSON-136728812 tempest-TenantUsagesTestJSON-136728812-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 615.645749] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-436aab8f-2d4a-47ba-b2c0-250864227759 tempest-TenantUsagesTestJSON-136728812 tempest-TenantUsagesTestJSON-136728812-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 615.645918] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-436aab8f-2d4a-47ba-b2c0-250864227759 tempest-TenantUsagesTestJSON-136728812 tempest-TenantUsagesTestJSON-136728812-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 615.661618] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-436aab8f-2d4a-47ba-b2c0-250864227759 tempest-TenantUsagesTestJSON-136728812 tempest-TenantUsagesTestJSON-136728812-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 615.661861] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-436aab8f-2d4a-47ba-b2c0-250864227759 tempest-TenantUsagesTestJSON-136728812 tempest-TenantUsagesTestJSON-136728812-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 615.662049] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-436aab8f-2d4a-47ba-b2c0-250864227759 tempest-TenantUsagesTestJSON-136728812 tempest-TenantUsagesTestJSON-136728812-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 621.075636] nova-conductor[52553]: ERROR nova.scheduler.utils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 
43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 621.081018] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Rescheduling: True {{(pid=52553) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 621.081018] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee. 
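The traceback above shows the failure path for instance 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee: the VMware driver raises oslo_vmware.exceptions.VimFaultException ("A specified parameter was not correct: fileType", fault InvalidArgument) while caching the image disk during spawn, the compute manager converts that into RescheduledException, and because the earlier scheduling entries all reported "Alternates: []" the conductor has no other host to try and immediately logs MaxRetriesExceeded, after which (next entry) the instance is set to ERROR. The following is only a minimal triage sketch, not part of Nova: a stdlib Python helper that greps a conductor log for the message substrings visible in this excerpt and groups them per instance UUID. The default log path, the pattern list, and the script name are assumptions.

    #!/usr/bin/env python3
    """Hypothetical helper for triaging nova-conductor logs like this excerpt.

    It only searches for message substrings that appear in the entries above
    and keys them on the "[instance: <uuid>]" marker; entries that carry the
    UUID only in free text (e.g. the MaxRetriesExceeded warning) fall under a
    placeholder bucket.
    """
    import re
    import sys
    from collections import defaultdict

    # UUID as it appears in the "[instance: <uuid>]" markers above.
    INSTANCE_RE = re.compile(r"\[instance: ([0-9a-f-]{36})\]")

    EVENTS = {
        "Error from last host": "reschedule_error",
        "MaxRetriesExceeded": "retries_exhausted",
        "Setting instance to ERROR state": "set_error_state",
        "NoValidHost": "no_valid_host",
    }

    def summarize(path):
        per_instance = defaultdict(set)
        with open(path, errors="replace") as fh:
            for line in fh:
                match = INSTANCE_RE.search(line)
                for needle, tag in EVENTS.items():
                    if needle in line:
                        key = match.group(1) if match else "<no instance uuid>"
                        per_instance[key].add(tag)
        return per_instance

    if __name__ == "__main__":
        # Assumed location; pass the real conductor log path as the argument.
        log_path = sys.argv[1] if len(sys.argv) > 1 else "nova-conductor.log"
        for uuid, tags in sorted(summarize(log_path).items()):
            print(uuid, "->", ", ".join(sorted(tags)))

Saved as, say, triage_conductor_log.py (name assumed) and run against a copy of this log, it separates instances that only rescheduled from those that ended in ERROR state.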
[ 621.081018] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee. [ 621.120094] nova-conductor[52553]: DEBUG nova.network.neutron [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] deallocate_for_instance() {{(pid=52553) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 621.267465] nova-conductor[52553]: DEBUG nova.network.neutron [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Instance cache missing network info. {{(pid=52553) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 621.275614] nova-conductor[52553]: DEBUG nova.network.neutron [None req-3048873c-2770-4203-b51f-4df52d73f47e tempest-ServerDiagnosticsTest-107573575 tempest-ServerDiagnosticsTest-107573575-project-member] [instance: 43ec7ea9-b0a1-4c7f-85a5-1517a2dbf7ee] Updating instance_info_cache with network_info: [] {{(pid=52553) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 627.493455] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Took 0.13 seconds to select destinations for 1 instance(s). 
{{(pid=52554) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 627.509880] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 627.510153] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 627.510338] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 627.560665] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 627.561145] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 627.561145] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 627.561457] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 627.561647] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 627.562065] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 627.574519] nova-conductor[52554]: DEBUG nova.quota [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Getting quotas for project 50469143b9b441119f5bfcff560f3a9c. Resources: {'cores', 'instances', 'ram'} {{(pid=52554) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 627.576997] nova-conductor[52554]: DEBUG nova.quota [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Getting quotas for user 4817f779374d427f8a2ad8e25b0d97f2 and project 50469143b9b441119f5bfcff560f3a9c. Resources: {'cores', 'instances', 'ram'} {{(pid=52554) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 627.582493] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52554) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 627.583034] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 627.583242] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 627.583479] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 627.586952] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] [instance: 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 627.587644] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 627.587855] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 627.588039] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 627.603438] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 627.603670] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 627.603843] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 667.314567] nova-conductor[52554]: ERROR nova.scheduler.utils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Error from last 
host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance db1dd823-8349-4f34-9a8e-ecec90bd105b was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 667.314567] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Rescheduling: True {{(pid=52554) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 667.314567] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance db1dd823-8349-4f34-9a8e-ecec90bd105b.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance db1dd823-8349-4f34-9a8e-ecec90bd105b. 
[ 667.314567] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance db1dd823-8349-4f34-9a8e-ecec90bd105b. [ 667.341279] nova-conductor[52554]: DEBUG nova.network.neutron [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] deallocate_for_instance() {{(pid=52554) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 667.476729] nova-conductor[52554]: DEBUG nova.network.neutron [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Instance cache missing network info. {{(pid=52554) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 667.482107] nova-conductor[52554]: DEBUG nova.network.neutron [None req-deb37827-d8fc-4227-915c-1adfea93e03b tempest-ServerDiagnosticsNegativeTest-707153589 tempest-ServerDiagnosticsNegativeTest-707153589-project-member] [instance: db1dd823-8349-4f34-9a8e-ecec90bd105b] Updating instance_info_cache with network_info: [] {{(pid=52554) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 673.520019] nova-conductor[52553]: ERROR nova.conductor.manager [None req-2af1dcb0-4f01-4977-8208-3c3bfb483fab tempest-ServersTestMultiNic-503120952 tempest-ServersTestMultiNic-503120952-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 673.520019] nova-conductor[52553]: Traceback (most recent call last): [ 673.520019] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 673.520019] nova-conductor[52553]: return func(*args, **kwargs) [ 673.520019] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 673.520019] nova-conductor[52553]: selections = self._select_destinations( [ 673.520019] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 673.520019] nova-conductor[52553]: selections = self._schedule( [ 673.520019] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 673.520019] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 673.520019] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 673.520019] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 673.520019] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 673.520019] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 673.520019] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 673.520019] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 673.520019] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 673.520019] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 673.520019] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 673.520019] nova-conductor[52553]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 673.520019] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 673.520019] nova-conductor[52553]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 673.520019] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 673.520019] nova-conductor[52553]: ERROR nova.conductor.manager result = self.transport._send( [ 673.520019] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 673.520019] nova-conductor[52553]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 673.520019] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 673.520019] nova-conductor[52553]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 673.520019] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 673.520019] nova-conductor[52553]: ERROR nova.conductor.manager raise result [ 673.520019] nova-conductor[52553]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 673.520019] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 673.520019] nova-conductor[52553]: ERROR nova.conductor.manager [ 673.520019] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 673.520019] nova-conductor[52553]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 673.520019] nova-conductor[52553]: ERROR nova.conductor.manager [ 673.520019] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 673.520019] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._select_destinations( [ 673.520019] nova-conductor[52553]: ERROR nova.conductor.manager [ 673.520019] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 673.520019] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._schedule( [ 673.520019] nova-conductor[52553]: ERROR nova.conductor.manager [ 673.520019] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 673.520019] nova-conductor[52553]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 673.520019] nova-conductor[52553]: ERROR nova.conductor.manager [ 673.520019] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 673.520019] nova-conductor[52553]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 673.520019] nova-conductor[52553]: ERROR nova.conductor.manager [ 673.520019] nova-conductor[52553]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
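Note: the outer ERROR record shows the conductor side of the same failure: schedule_and_build_instances calls select_destinations over RPC, oslo.messaging re-raises the scheduler's exception locally as nova.exception_Remote.NoValidHost_Remote, and the WARNING lines that follow put the instance into ERROR instead of retrying. A hedged sketch of that pattern (only the call chain comes from the traceback; the helper and field names here are assumptions):

    # Sketch of the conductor-side handling visible in this log: ask the scheduler
    # for destinations and, on NoValidHost, record the fault and mark the instance
    # ERROR. Not Nova's actual implementation.
    class NoValidHost(Exception):
        pass


    def schedule_and_build_instance(query_client, context, request_spec, instance):
        try:
            # Corresponds to query_client.select_destinations(...) in the traceback.
            host_lists = query_client.select_destinations(context, request_spec)
        except NoValidHost as exc:
            # Corresponds to "Setting instance to ERROR state." in the WARNINGs.
            instance["vm_state"] = "error"
            instance["fault"] = str(exc)
            return None
        return host_lists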
[ 673.520019] nova-conductor[52553]: ERROR nova.conductor.manager [ 673.520019] nova-conductor[52553]: ERROR nova.conductor.manager [ 673.531420] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-2af1dcb0-4f01-4977-8208-3c3bfb483fab tempest-ServersTestMultiNic-503120952 tempest-ServersTestMultiNic-503120952-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 673.531420] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-2af1dcb0-4f01-4977-8208-3c3bfb483fab tempest-ServersTestMultiNic-503120952 tempest-ServersTestMultiNic-503120952-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 673.531420] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-2af1dcb0-4f01-4977-8208-3c3bfb483fab tempest-ServersTestMultiNic-503120952 tempest-ServersTestMultiNic-503120952-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 673.591617] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-2af1dcb0-4f01-4977-8208-3c3bfb483fab tempest-ServersTestMultiNic-503120952 tempest-ServersTestMultiNic-503120952-project-member] [instance: d0d32dfc-cfcf-42aa-9fd4-5eca9970626c] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 673.592352] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-2af1dcb0-4f01-4977-8208-3c3bfb483fab tempest-ServersTestMultiNic-503120952 tempest-ServersTestMultiNic-503120952-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 673.592604] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-2af1dcb0-4f01-4977-8208-3c3bfb483fab tempest-ServersTestMultiNic-503120952 tempest-ServersTestMultiNic-503120952-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 673.592784] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-2af1dcb0-4f01-4977-8208-3c3bfb483fab tempest-ServersTestMultiNic-503120952 tempest-ServersTestMultiNic-503120952-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 673.595824] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-2af1dcb0-4f01-4977-8208-3c3bfb483fab tempest-ServersTestMultiNic-503120952 tempest-ServersTestMultiNic-503120952-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 673.595824] nova-conductor[52553]: Traceback (most recent call last): [ 673.595824] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 673.595824] nova-conductor[52553]: return func(*args, **kwargs) [ 673.595824] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 673.595824] nova-conductor[52553]: selections = self._select_destinations( [ 673.595824] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 673.595824] nova-conductor[52553]: selections = self._schedule( [ 673.595824] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 673.595824] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 673.595824] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 673.595824] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 673.595824] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 673.595824] nova-conductor[52553]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 673.596921] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-2af1dcb0-4f01-4977-8208-3c3bfb483fab tempest-ServersTestMultiNic-503120952 tempest-ServersTestMultiNic-503120952-project-member] [instance: d0d32dfc-cfcf-42aa-9fd4-5eca9970626c] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 677.566122] nova-conductor[52554]: ERROR nova.conductor.manager [None req-b7fef49e-0b83-4d3f-9724-374f5ac2f9f6 tempest-ServersTestMultiNic-503120952 tempest-ServersTestMultiNic-503120952-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 677.566122] nova-conductor[52554]: Traceback (most recent call last): [ 677.566122] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 677.566122] nova-conductor[52554]: return func(*args, **kwargs) [ 677.566122] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 677.566122] nova-conductor[52554]: selections = self._select_destinations( [ 677.566122] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 677.566122] nova-conductor[52554]: selections = self._schedule( [ 677.566122] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 677.566122] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 677.566122] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 677.566122] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 677.566122] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 677.566122] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 677.566122] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 677.566122] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 677.566122] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 677.566122] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 677.566122] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 677.566122] nova-conductor[52554]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 677.566122] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 677.566122] nova-conductor[52554]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 677.566122] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 677.566122] nova-conductor[52554]: ERROR nova.conductor.manager result = self.transport._send( [ 677.566122] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 677.566122] nova-conductor[52554]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 677.566122] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 677.566122] nova-conductor[52554]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 677.566122] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 677.566122] nova-conductor[52554]: ERROR nova.conductor.manager raise result [ 677.566122] nova-conductor[52554]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 677.566122] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 677.566122] nova-conductor[52554]: ERROR nova.conductor.manager [ 677.566122] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 677.566122] nova-conductor[52554]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 677.566122] nova-conductor[52554]: ERROR nova.conductor.manager [ 677.566122] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 677.566122] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._select_destinations( [ 677.566122] nova-conductor[52554]: ERROR nova.conductor.manager [ 677.566122] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 677.566122] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._schedule( [ 677.566122] nova-conductor[52554]: ERROR nova.conductor.manager [ 677.566122] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 677.566122] nova-conductor[52554]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 677.566122] nova-conductor[52554]: ERROR nova.conductor.manager [ 677.566122] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 677.566122] nova-conductor[52554]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 677.566122] nova-conductor[52554]: ERROR nova.conductor.manager [ 677.566122] nova-conductor[52554]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
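Note: the 'Acquiring lock ... / Lock ... acquired / Lock ... "released"' DEBUG lines interleaved with these errors come from oslo.concurrency's lockutils wrapping get_or_set_cached_cell_and_set_connections while the conductor targets cell0 to record the failure. A minimal example of that locking pattern (the lock name is the cell0 UUID from the log; the cache and factory are illustrative assumptions):

    # Minimal example of the oslo.concurrency lock pattern behind the
    # acquire/release DEBUG lines above.
    import logging

    from oslo_concurrency import lockutils

    logging.basicConfig(level=logging.DEBUG)  # make the lockutils DEBUG lines visible

    CELL0_UUID = "00000000-0000-0000-0000-000000000000"
    _cell_cache = {}


    @lockutils.synchronized(CELL0_UUID)
    def get_or_set_cached_cell(uuid, factory):
        """Serialize access to a shared cell cache, as the log's wrapped helper does."""
        if uuid not in _cell_cache:
            _cell_cache[uuid] = factory(uuid)
        return _cell_cache[uuid]


    get_or_set_cached_cell(CELL0_UUID, factory=lambda u: {"uuid": u})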
[ 677.566122] nova-conductor[52554]: ERROR nova.conductor.manager [ 677.566122] nova-conductor[52554]: ERROR nova.conductor.manager [ 677.575091] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b7fef49e-0b83-4d3f-9724-374f5ac2f9f6 tempest-ServersTestMultiNic-503120952 tempest-ServersTestMultiNic-503120952-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 677.575091] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b7fef49e-0b83-4d3f-9724-374f5ac2f9f6 tempest-ServersTestMultiNic-503120952 tempest-ServersTestMultiNic-503120952-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 677.575091] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b7fef49e-0b83-4d3f-9724-374f5ac2f9f6 tempest-ServersTestMultiNic-503120952 tempest-ServersTestMultiNic-503120952-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 677.626027] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-b7fef49e-0b83-4d3f-9724-374f5ac2f9f6 tempest-ServersTestMultiNic-503120952 tempest-ServersTestMultiNic-503120952-project-member] [instance: 08de3c61-c52e-4854-9885-398764470d86] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 677.626027] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b7fef49e-0b83-4d3f-9724-374f5ac2f9f6 tempest-ServersTestMultiNic-503120952 tempest-ServersTestMultiNic-503120952-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 677.626027] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b7fef49e-0b83-4d3f-9724-374f5ac2f9f6 tempest-ServersTestMultiNic-503120952 tempest-ServersTestMultiNic-503120952-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 677.626027] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b7fef49e-0b83-4d3f-9724-374f5ac2f9f6 tempest-ServersTestMultiNic-503120952 tempest-ServersTestMultiNic-503120952-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 677.629957] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-b7fef49e-0b83-4d3f-9724-374f5ac2f9f6 tempest-ServersTestMultiNic-503120952 tempest-ServersTestMultiNic-503120952-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 677.629957] nova-conductor[52554]: Traceback (most recent call last): [ 677.629957] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 677.629957] nova-conductor[52554]: return func(*args, **kwargs) [ 677.629957] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 677.629957] nova-conductor[52554]: selections = self._select_destinations( [ 677.629957] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 677.629957] nova-conductor[52554]: selections = self._schedule( [ 677.629957] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 677.629957] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 677.629957] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 677.629957] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 677.629957] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 677.629957] nova-conductor[52554]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 677.630490] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-b7fef49e-0b83-4d3f-9724-374f5ac2f9f6 tempest-ServersTestMultiNic-503120952 tempest-ServersTestMultiNic-503120952-project-member] [instance: 08de3c61-c52e-4854-9885-398764470d86] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 685.268962] nova-conductor[52553]: ERROR nova.conductor.manager [None req-92b27424-bf4f-477d-b634-9f04b700797d tempest-ServerMetadataTestJSON-718186560 tempest-ServerMetadataTestJSON-718186560-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 685.268962] nova-conductor[52553]: Traceback (most recent call last): [ 685.268962] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 685.268962] nova-conductor[52553]: return func(*args, **kwargs) [ 685.268962] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 685.268962] nova-conductor[52553]: selections = self._select_destinations( [ 685.268962] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 685.268962] nova-conductor[52553]: selections = self._schedule( [ 685.268962] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 685.268962] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 685.268962] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 685.268962] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 685.268962] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 685.268962] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 685.268962] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 685.268962] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 685.268962] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 685.268962] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 685.268962] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 685.268962] nova-conductor[52553]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 685.268962] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 685.268962] nova-conductor[52553]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 685.268962] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 685.268962] nova-conductor[52553]: ERROR nova.conductor.manager result = self.transport._send( [ 685.268962] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 685.268962] nova-conductor[52553]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 685.268962] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 685.268962] nova-conductor[52553]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 685.268962] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 685.268962] nova-conductor[52553]: ERROR nova.conductor.manager raise result [ 685.268962] nova-conductor[52553]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 685.268962] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 685.268962] nova-conductor[52553]: ERROR nova.conductor.manager [ 685.268962] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 685.268962] nova-conductor[52553]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 685.268962] nova-conductor[52553]: ERROR nova.conductor.manager [ 685.268962] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 685.268962] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._select_destinations( [ 685.268962] nova-conductor[52553]: ERROR nova.conductor.manager [ 685.268962] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 685.268962] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._schedule( [ 685.268962] nova-conductor[52553]: ERROR nova.conductor.manager [ 685.268962] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 685.268962] nova-conductor[52553]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 685.268962] nova-conductor[52553]: ERROR nova.conductor.manager [ 685.268962] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 685.268962] nova-conductor[52553]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 685.268962] nova-conductor[52553]: ERROR nova.conductor.manager [ 685.268962] nova-conductor[52553]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
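Note: the block_device_mapping DEBUG entries above all describe the same thing: a single image-backed root disk (source_type='image', destination_type='local', boot_index=0, delete_on_termination=True) built from image 856e89ba-b7a4-4a81-ad9d-2997fe327c0c. Expressed as the block_device_mapping_v2 element a client would send on server create, that is roughly the following (field values come from the log; the surrounding request structure is an assumption for illustration):

    # Equivalent block_device_mapping_v2 entry for the root disk logged above.
    root_disk_bdm = {
        "boot_index": 0,                 # boot_index=0 in the log
        "source_type": "image",          # source_type='image'
        "destination_type": "local",     # destination_type='local' (ephemeral/local disk)
        "uuid": "856e89ba-b7a4-4a81-ad9d-2997fe327c0c",  # image_id from the log
        "delete_on_termination": True,   # delete_on_termination=True
    }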
[ 685.268962] nova-conductor[52553]: ERROR nova.conductor.manager [ 685.268962] nova-conductor[52553]: ERROR nova.conductor.manager [ 685.280450] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-92b27424-bf4f-477d-b634-9f04b700797d tempest-ServerMetadataTestJSON-718186560 tempest-ServerMetadataTestJSON-718186560-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 685.280450] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-92b27424-bf4f-477d-b634-9f04b700797d tempest-ServerMetadataTestJSON-718186560 tempest-ServerMetadataTestJSON-718186560-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 685.280450] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-92b27424-bf4f-477d-b634-9f04b700797d tempest-ServerMetadataTestJSON-718186560 tempest-ServerMetadataTestJSON-718186560-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 685.321757] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-92b27424-bf4f-477d-b634-9f04b700797d tempest-ServerMetadataTestJSON-718186560 tempest-ServerMetadataTestJSON-718186560-project-member] [instance: e983a53d-d557-4710-a58d-4de61a7aafe6] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 685.322522] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-92b27424-bf4f-477d-b634-9f04b700797d tempest-ServerMetadataTestJSON-718186560 tempest-ServerMetadataTestJSON-718186560-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 685.322701] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-92b27424-bf4f-477d-b634-9f04b700797d tempest-ServerMetadataTestJSON-718186560 tempest-ServerMetadataTestJSON-718186560-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 685.323026] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-92b27424-bf4f-477d-b634-9f04b700797d tempest-ServerMetadataTestJSON-718186560 tempest-ServerMetadataTestJSON-718186560-project-member] Lock "00000000-0000-0000-0000-000000000000" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 685.325872] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-92b27424-bf4f-477d-b634-9f04b700797d tempest-ServerMetadataTestJSON-718186560 tempest-ServerMetadataTestJSON-718186560-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 685.325872] nova-conductor[52553]: Traceback (most recent call last): [ 685.325872] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 685.325872] nova-conductor[52553]: return func(*args, **kwargs) [ 685.325872] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 685.325872] nova-conductor[52553]: selections = self._select_destinations( [ 685.325872] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 685.325872] nova-conductor[52553]: selections = self._schedule( [ 685.325872] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 685.325872] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 685.325872] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 685.325872] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 685.325872] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 685.325872] nova-conductor[52553]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 685.326443] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-92b27424-bf4f-477d-b634-9f04b700797d tempest-ServerMetadataTestJSON-718186560 tempest-ServerMetadataTestJSON-718186560-project-member] [instance: e983a53d-d557-4710-a58d-4de61a7aafe6] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 689.310415] nova-conductor[52554]: ERROR nova.conductor.manager [None req-0050b230-bf0a-4903-9996-79885a3b2d87 tempest-ServerShowV247Test-1614233829 tempest-ServerShowV247Test-1614233829-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 689.310415] nova-conductor[52554]: Traceback (most recent call last): [ 689.310415] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 689.310415] nova-conductor[52554]: return func(*args, **kwargs) [ 689.310415] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 689.310415] nova-conductor[52554]: selections = self._select_destinations( [ 689.310415] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 689.310415] nova-conductor[52554]: selections = self._schedule( [ 689.310415] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 689.310415] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 689.310415] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 689.310415] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 689.310415] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 689.310415] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 689.310415] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 689.310415] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 689.310415] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 689.310415] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 689.310415] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 689.310415] nova-conductor[52554]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 689.310415] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 689.310415] nova-conductor[52554]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 689.310415] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 689.310415] nova-conductor[52554]: ERROR nova.conductor.manager result = self.transport._send( [ 689.310415] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 689.310415] nova-conductor[52554]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 689.310415] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 689.310415] nova-conductor[52554]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 689.310415] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 689.310415] nova-conductor[52554]: ERROR nova.conductor.manager raise result [ 689.310415] nova-conductor[52554]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 689.310415] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 689.310415] nova-conductor[52554]: ERROR nova.conductor.manager [ 689.310415] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 689.310415] nova-conductor[52554]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 689.310415] nova-conductor[52554]: ERROR nova.conductor.manager [ 689.310415] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 689.310415] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._select_destinations( [ 689.310415] nova-conductor[52554]: ERROR nova.conductor.manager [ 689.310415] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 689.310415] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._schedule( [ 689.310415] nova-conductor[52554]: ERROR nova.conductor.manager [ 689.310415] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 689.310415] nova-conductor[52554]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 689.310415] nova-conductor[52554]: ERROR nova.conductor.manager [ 689.310415] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 689.310415] nova-conductor[52554]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 689.310415] nova-conductor[52554]: ERROR nova.conductor.manager [ 689.310415] nova-conductor[52554]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
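Note: every record of a given failure carries the same request ID (req-0050b230-... for the attempt above), so the repeated tracebacks collapse neatly per request when triaging a log like this one. A small, self-contained sketch of that grouping (the log file name and regex are assumptions):

    # Group nova-conductor log records by request ID and report which requests
    # ended in NoValidHost, matching the failures above.
    import re
    from collections import defaultdict

    REQ_RE = re.compile(r"req-[0-9a-f]{8}(?:-[0-9a-f]{4}){3}-[0-9a-f]{12}")


    def failed_requests(path="nova-conductor.log"):
        failures = defaultdict(list)
        with open(path) as fh:
            for line in fh:
                match = REQ_RE.search(line)
                if match and "NoValidHost" in line:
                    failures[match.group(0)].append(line.rstrip())
        return failures


    if __name__ == "__main__":
        for req_id, lines in failed_requests().items():
            print(f"{req_id}: {len(lines)} NoValidHost record(s)")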
[ 689.310415] nova-conductor[52554]: ERROR nova.conductor.manager [ 689.310415] nova-conductor[52554]: ERROR nova.conductor.manager [ 689.320211] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0050b230-bf0a-4903-9996-79885a3b2d87 tempest-ServerShowV247Test-1614233829 tempest-ServerShowV247Test-1614233829-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 689.320449] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0050b230-bf0a-4903-9996-79885a3b2d87 tempest-ServerShowV247Test-1614233829 tempest-ServerShowV247Test-1614233829-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 689.320744] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0050b230-bf0a-4903-9996-79885a3b2d87 tempest-ServerShowV247Test-1614233829 tempest-ServerShowV247Test-1614233829-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 689.362221] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-0050b230-bf0a-4903-9996-79885a3b2d87 tempest-ServerShowV247Test-1614233829 tempest-ServerShowV247Test-1614233829-project-member] [instance: 4fe80c68-c78f-48fc-9537-d1950cab6887] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 689.362969] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0050b230-bf0a-4903-9996-79885a3b2d87 tempest-ServerShowV247Test-1614233829 tempest-ServerShowV247Test-1614233829-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 689.363199] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0050b230-bf0a-4903-9996-79885a3b2d87 tempest-ServerShowV247Test-1614233829 tempest-ServerShowV247Test-1614233829-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 689.363372] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0050b230-bf0a-4903-9996-79885a3b2d87 tempest-ServerShowV247Test-1614233829 tempest-ServerShowV247Test-1614233829-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 689.369993] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-0050b230-bf0a-4903-9996-79885a3b2d87 tempest-ServerShowV247Test-1614233829 tempest-ServerShowV247Test-1614233829-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 689.369993] nova-conductor[52554]: Traceback (most recent call last): [ 689.369993] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 689.369993] nova-conductor[52554]: return func(*args, **kwargs) [ 689.369993] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 689.369993] nova-conductor[52554]: selections = self._select_destinations( [ 689.369993] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 689.369993] nova-conductor[52554]: selections = self._schedule( [ 689.369993] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 689.369993] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 689.369993] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 689.369993] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 689.369993] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 689.369993] nova-conductor[52554]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 689.371833] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-0050b230-bf0a-4903-9996-79885a3b2d87 tempest-ServerShowV247Test-1614233829 tempest-ServerShowV247Test-1614233829-project-member] [instance: 4fe80c68-c78f-48fc-9537-d1950cab6887] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 689.534099] nova-conductor[52553]: ERROR nova.conductor.manager [None req-4bf148ca-50ca-4902-b828-5f90b0072f40 tempest-ServerShowV247Test-1614233829 tempest-ServerShowV247Test-1614233829-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 689.534099] nova-conductor[52553]: Traceback (most recent call last): [ 689.534099] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 689.534099] nova-conductor[52553]: return func(*args, **kwargs) [ 689.534099] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 689.534099] nova-conductor[52553]: selections = self._select_destinations( [ 689.534099] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 689.534099] nova-conductor[52553]: selections = self._schedule( [ 689.534099] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 689.534099] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 689.534099] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 689.534099] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 689.534099] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 689.534099] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 689.534099] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 689.534099] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 689.534099] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 689.534099] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 689.534099] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 689.534099] nova-conductor[52553]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 689.534099] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 689.534099] nova-conductor[52553]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 689.534099] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 689.534099] nova-conductor[52553]: ERROR nova.conductor.manager result = self.transport._send( [ 689.534099] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 689.534099] nova-conductor[52553]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 689.534099] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 689.534099] nova-conductor[52553]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 689.534099] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 689.534099] nova-conductor[52553]: ERROR nova.conductor.manager raise result [ 689.534099] nova-conductor[52553]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 689.534099] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 689.534099] nova-conductor[52553]: ERROR nova.conductor.manager [ 689.534099] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 689.534099] nova-conductor[52553]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 689.534099] nova-conductor[52553]: ERROR nova.conductor.manager [ 689.534099] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 689.534099] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._select_destinations( [ 689.534099] nova-conductor[52553]: ERROR nova.conductor.manager [ 689.534099] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 689.534099] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._schedule( [ 689.534099] nova-conductor[52553]: ERROR nova.conductor.manager [ 689.534099] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 689.534099] nova-conductor[52553]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 689.534099] nova-conductor[52553]: ERROR nova.conductor.manager [ 689.534099] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 689.534099] nova-conductor[52553]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 689.534099] nova-conductor[52553]: ERROR nova.conductor.manager [ 689.534099] nova-conductor[52553]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
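Note: unlike these NoValidHost failures, the first WARNING in this stretch (instance db1dd823-8349-4f34-9a8e-ecec90bd105b) failed with MaxRetriesExceeded: nova.scheduler.utils, running in the conductor, tracks how many build attempts an instance has already consumed and gives up once the limit is reached, which likewise ends with the instance set to ERROR. A rough sketch of that bookkeeping (the attempt limit and the filter_properties layout are assumptions; only the exception and its message come from the log):

    # Illustrative retry bookkeeping behind "Exceeded maximum number of retries.
    # Exhausted all hosts available for retrying build failures for instance ...".
    class MaxRetriesExceeded(Exception):
        pass


    def populate_retry(filter_properties, instance_uuid, max_attempts=3):
        retry = filter_properties.setdefault("retry", {"num_attempts": 0, "hosts": []})
        retry["num_attempts"] += 1
        if retry["num_attempts"] > max_attempts:
            raise MaxRetriesExceeded(
                "Exceeded maximum number of retries. Exhausted all hosts available "
                "for retrying build failures for instance %s." % instance_uuid)
        return retry


    props = {}
    populate_retry(props, "db1dd823-8349-4f34-9a8e-ecec90bd105b")  # first attempt passes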
[ 689.534099] nova-conductor[52553]: ERROR nova.conductor.manager [ 689.534099] nova-conductor[52553]: ERROR nova.conductor.manager [ 689.540965] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-4bf148ca-50ca-4902-b828-5f90b0072f40 tempest-ServerShowV247Test-1614233829 tempest-ServerShowV247Test-1614233829-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 689.541201] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-4bf148ca-50ca-4902-b828-5f90b0072f40 tempest-ServerShowV247Test-1614233829 tempest-ServerShowV247Test-1614233829-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 689.541375] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-4bf148ca-50ca-4902-b828-5f90b0072f40 tempest-ServerShowV247Test-1614233829 tempest-ServerShowV247Test-1614233829-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 689.585731] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-4bf148ca-50ca-4902-b828-5f90b0072f40 tempest-ServerShowV247Test-1614233829 tempest-ServerShowV247Test-1614233829-project-member] [instance: 074bfa3a-f3ba-45a0-80a0-62f40bd7031f] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 689.586417] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-4bf148ca-50ca-4902-b828-5f90b0072f40 tempest-ServerShowV247Test-1614233829 tempest-ServerShowV247Test-1614233829-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 689.586633] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-4bf148ca-50ca-4902-b828-5f90b0072f40 tempest-ServerShowV247Test-1614233829 tempest-ServerShowV247Test-1614233829-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 689.586809] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-4bf148ca-50ca-4902-b828-5f90b0072f40 tempest-ServerShowV247Test-1614233829 tempest-ServerShowV247Test-1614233829-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 689.591090] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-4bf148ca-50ca-4902-b828-5f90b0072f40 tempest-ServerShowV247Test-1614233829 tempest-ServerShowV247Test-1614233829-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 689.591090] nova-conductor[52553]: Traceback (most recent call last): [ 689.591090] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 689.591090] nova-conductor[52553]: return func(*args, **kwargs) [ 689.591090] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 689.591090] nova-conductor[52553]: selections = self._select_destinations( [ 689.591090] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 689.591090] nova-conductor[52553]: selections = self._schedule( [ 689.591090] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 689.591090] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 689.591090] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 689.591090] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 689.591090] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 689.591090] nova-conductor[52553]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 689.591644] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-4bf148ca-50ca-4902-b828-5f90b0072f40 tempest-ServerShowV247Test-1614233829 tempest-ServerShowV247Test-1614233829-project-member] [instance: 074bfa3a-f3ba-45a0-80a0-62f40bd7031f] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 714.505579] nova-conductor[52553]: DEBUG nova.db.main.api [None req-2113d3f9-d4fd-4954-b30d-2701ee0748c0 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Created instance_extra for ebc60b43-dc9e-4f3c-81c7-f65fe50be628 {{(pid=52553) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 715.654268] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Took 0.15 seconds to select destinations for 1 instance(s). 
{{(pid=52554) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 715.665599] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 715.665991] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 715.665991] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 715.676764] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Took 0.14 seconds to select destinations for 1 instance(s). {{(pid=52553) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 715.686567] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 715.686972] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 715.687399] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 715.696998] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 715.697249] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 715.697418] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 715.698293] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 715.698519] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 715.698693] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 715.708924] nova-conductor[52554]: DEBUG nova.quota [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Getting quotas for project a07d0346e8884cf394bb87ea702ec039. Resources: {'cores', 'instances', 'ram'} {{(pid=52554) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 715.714776] nova-conductor[52554]: DEBUG nova.quota [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Getting quotas for user 9b20a4b99c3041d986483e1c4d1cbe79 and project a07d0346e8884cf394bb87ea702ec039. 
Resources: {'cores', 'instances', 'ram'} {{(pid=52554) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 715.718861] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: e84f3fe9-d377-4018-8874-972d1f888208] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52554) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 715.720065] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 715.720065] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 715.720065] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 715.722633] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] [instance: e84f3fe9-d377-4018-8874-972d1f888208] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 715.726022] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 715.726022] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Lock 
"e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 715.726139] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 715.726139] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 715.726139] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 715.726139] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 715.726139] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 715.726139] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 715.726308] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 715.734357] nova-conductor[52553]: DEBUG nova.quota [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 
tempest-MigrationsAdminTest-2016093575-project-member] Getting quotas for project 7913858bdbbe4375917c0e1864ee8d2e. Resources: {'cores', 'instances', 'ram'} {{(pid=52553) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 715.736982] nova-conductor[52553]: DEBUG nova.quota [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Getting quotas for user 588d0c5d584544c3be2d880de2c00a37 and project 7913858bdbbe4375917c0e1864ee8d2e. Resources: {'cores', 'instances', 'ram'} {{(pid=52553) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 715.738244] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 715.738586] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 715.738777] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 715.743158] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52553) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 715.743806] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 715.744168] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 715.744497] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 
tempest-MigrationsAdminTest-2016093575-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 715.748072] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] [instance: 0d87148b-1493-4777-a8b3-b94a64e8eca6] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 715.748883] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 715.749325] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 715.749929] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 715.763371] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 715.763803] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 715.764160] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 
tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 761.569565] nova-conductor[52553]: DEBUG nova.db.main.api [None req-adda8732-e35a-4a60-965c-1dba92160496 tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Created instance_extra for 4e62d785-7c74-4d3a-9446-e690822d5386 {{(pid=52553) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 794.435548] nova-conductor[52553]: ERROR nova.conductor.manager [None req-a279e742-0644-4d61-be51-870e83cd9176 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 794.435548] nova-conductor[52553]: Traceback (most recent call last): [ 794.435548] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 794.435548] nova-conductor[52553]: return func(*args, **kwargs) [ 794.435548] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 794.435548] nova-conductor[52553]: selections = self._select_destinations( [ 794.435548] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 794.435548] nova-conductor[52553]: selections = self._schedule( [ 794.435548] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 794.435548] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 794.435548] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 794.435548] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 794.435548] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 794.435548] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 794.435548] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 794.435548] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 794.435548] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 794.435548] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 794.435548] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 794.435548] nova-conductor[52553]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 794.435548] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 794.435548] nova-conductor[52553]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 794.435548] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 794.435548] nova-conductor[52553]: ERROR nova.conductor.manager result = self.transport._send( [ 794.435548] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 794.435548] nova-conductor[52553]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 794.435548] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 794.435548] nova-conductor[52553]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 794.435548] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 794.435548] nova-conductor[52553]: ERROR nova.conductor.manager raise result [ 794.435548] nova-conductor[52553]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 794.435548] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 794.435548] nova-conductor[52553]: ERROR nova.conductor.manager [ 794.435548] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 794.435548] nova-conductor[52553]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 794.435548] nova-conductor[52553]: ERROR nova.conductor.manager [ 794.435548] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 794.435548] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._select_destinations( [ 794.435548] nova-conductor[52553]: ERROR nova.conductor.manager [ 794.435548] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 794.435548] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._schedule( [ 794.435548] nova-conductor[52553]: ERROR nova.conductor.manager [ 794.435548] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 794.435548] nova-conductor[52553]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 794.435548] nova-conductor[52553]: ERROR nova.conductor.manager [ 794.435548] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 794.435548] nova-conductor[52553]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 794.435548] nova-conductor[52553]: ERROR nova.conductor.manager [ 794.435548] nova-conductor[52553]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 794.435548] nova-conductor[52553]: ERROR nova.conductor.manager [ 794.435548] nova-conductor[52553]: ERROR nova.conductor.manager [ 794.441782] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-a279e742-0644-4d61-be51-870e83cd9176 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 794.441903] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-a279e742-0644-4d61-be51-870e83cd9176 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 794.442096] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-a279e742-0644-4d61-be51-870e83cd9176 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 794.480586] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-a279e742-0644-4d61-be51-870e83cd9176 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 762267db-4b69-48a1-87c0-dca716b3679c] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 794.481321] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-a279e742-0644-4d61-be51-870e83cd9176 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 794.481553] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-a279e742-0644-4d61-be51-870e83cd9176 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 794.481727] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-a279e742-0644-4d61-be51-870e83cd9176 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 794.484543] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-a279e742-0644-4d61-be51-870e83cd9176 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 794.484543] nova-conductor[52553]: Traceback (most recent call last): [ 794.484543] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 794.484543] nova-conductor[52553]: return func(*args, **kwargs) [ 794.484543] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 794.484543] nova-conductor[52553]: selections = self._select_destinations( [ 794.484543] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 794.484543] nova-conductor[52553]: selections = self._schedule( [ 794.484543] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 794.484543] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 794.484543] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 794.484543] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 794.484543] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 794.484543] nova-conductor[52553]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 794.485322] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-a279e742-0644-4d61-be51-870e83cd9176 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 762267db-4b69-48a1-87c0-dca716b3679c] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 796.270119] nova-conductor[52554]: ERROR nova.conductor.manager [None req-5425c8bf-e4b9-489a-915a-344efe53fc17 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 796.270119] nova-conductor[52554]: Traceback (most recent call last): [ 796.270119] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 796.270119] nova-conductor[52554]: return func(*args, **kwargs) [ 796.270119] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 796.270119] nova-conductor[52554]: selections = self._select_destinations( [ 796.270119] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 796.270119] nova-conductor[52554]: selections = self._schedule( [ 796.270119] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 796.270119] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 796.270119] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 796.270119] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 796.270119] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 796.270119] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 796.270119] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 796.270119] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 796.270119] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 796.270119] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 796.270119] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 796.270119] nova-conductor[52554]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 796.270119] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 796.270119] nova-conductor[52554]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 796.270119] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 796.270119] nova-conductor[52554]: ERROR nova.conductor.manager result = self.transport._send( [ 796.270119] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 796.270119] nova-conductor[52554]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 796.270119] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 796.270119] nova-conductor[52554]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 796.270119] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 796.270119] nova-conductor[52554]: ERROR nova.conductor.manager raise result [ 796.270119] nova-conductor[52554]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 796.270119] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 796.270119] nova-conductor[52554]: ERROR nova.conductor.manager [ 796.270119] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 796.270119] nova-conductor[52554]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 796.270119] nova-conductor[52554]: ERROR nova.conductor.manager [ 796.270119] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 796.270119] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._select_destinations( [ 796.270119] nova-conductor[52554]: ERROR nova.conductor.manager [ 796.270119] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 796.270119] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._schedule( [ 796.270119] nova-conductor[52554]: ERROR nova.conductor.manager [ 796.270119] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 796.270119] nova-conductor[52554]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 796.270119] nova-conductor[52554]: ERROR nova.conductor.manager [ 796.270119] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 796.270119] nova-conductor[52554]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 796.270119] nova-conductor[52554]: ERROR nova.conductor.manager [ 796.270119] nova-conductor[52554]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 796.270119] nova-conductor[52554]: ERROR nova.conductor.manager [ 796.270119] nova-conductor[52554]: ERROR nova.conductor.manager [ 796.278058] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-5425c8bf-e4b9-489a-915a-344efe53fc17 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 796.278315] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-5425c8bf-e4b9-489a-915a-344efe53fc17 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 796.278493] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-5425c8bf-e4b9-489a-915a-344efe53fc17 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 796.316893] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-5425c8bf-e4b9-489a-915a-344efe53fc17 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: bf498dcd-5a65-4fc2-9f4c-9bfe75db8ae0] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 796.317598] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-5425c8bf-e4b9-489a-915a-344efe53fc17 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 796.317813] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-5425c8bf-e4b9-489a-915a-344efe53fc17 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 796.317983] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-5425c8bf-e4b9-489a-915a-344efe53fc17 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 796.320731] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-5425c8bf-e4b9-489a-915a-344efe53fc17 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 796.320731] nova-conductor[52554]: Traceback (most recent call last): [ 796.320731] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 796.320731] nova-conductor[52554]: return func(*args, **kwargs) [ 796.320731] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 796.320731] nova-conductor[52554]: selections = self._select_destinations( [ 796.320731] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 796.320731] nova-conductor[52554]: selections = self._schedule( [ 796.320731] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 796.320731] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 796.320731] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 796.320731] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 796.320731] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 796.320731] nova-conductor[52554]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 796.321499] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-5425c8bf-e4b9-489a-915a-344efe53fc17 tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: bf498dcd-5a65-4fc2-9f4c-9bfe75db8ae0] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 798.131041] nova-conductor[52553]: ERROR nova.conductor.manager [None req-84ce00ea-efa7-42e6-aa14-649a395731ae tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 798.131041] nova-conductor[52553]: Traceback (most recent call last): [ 798.131041] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 798.131041] nova-conductor[52553]: return func(*args, **kwargs) [ 798.131041] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 798.131041] nova-conductor[52553]: selections = self._select_destinations( [ 798.131041] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 798.131041] nova-conductor[52553]: selections = self._schedule( [ 798.131041] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 798.131041] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 798.131041] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 798.131041] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 798.131041] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 798.131041] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 798.131041] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 798.131041] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 798.131041] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 798.131041] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 798.131041] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 798.131041] nova-conductor[52553]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 798.131041] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 798.131041] nova-conductor[52553]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 798.131041] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 798.131041] nova-conductor[52553]: ERROR nova.conductor.manager result = self.transport._send( [ 798.131041] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 798.131041] nova-conductor[52553]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 798.131041] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 798.131041] nova-conductor[52553]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 798.131041] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 798.131041] nova-conductor[52553]: ERROR nova.conductor.manager raise result [ 798.131041] nova-conductor[52553]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 798.131041] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 798.131041] nova-conductor[52553]: ERROR nova.conductor.manager [ 798.131041] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 798.131041] nova-conductor[52553]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 798.131041] nova-conductor[52553]: ERROR nova.conductor.manager [ 798.131041] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 798.131041] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._select_destinations( [ 798.131041] nova-conductor[52553]: ERROR nova.conductor.manager [ 798.131041] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 798.131041] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._schedule( [ 798.131041] nova-conductor[52553]: ERROR nova.conductor.manager [ 798.131041] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 798.131041] nova-conductor[52553]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 798.131041] nova-conductor[52553]: ERROR nova.conductor.manager [ 798.131041] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 798.131041] nova-conductor[52553]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 798.131041] nova-conductor[52553]: ERROR nova.conductor.manager [ 798.131041] nova-conductor[52553]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 798.131041] nova-conductor[52553]: ERROR nova.conductor.manager [ 798.131041] nova-conductor[52553]: ERROR nova.conductor.manager [ 798.138118] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-84ce00ea-efa7-42e6-aa14-649a395731ae tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 798.138358] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-84ce00ea-efa7-42e6-aa14-649a395731ae tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 798.138536] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-84ce00ea-efa7-42e6-aa14-649a395731ae tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 798.178951] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-84ce00ea-efa7-42e6-aa14-649a395731ae tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 2b979b25-32ec-4c37-92bc-9ceb0765d8ae] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 798.179753] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-84ce00ea-efa7-42e6-aa14-649a395731ae tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 798.179970] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-84ce00ea-efa7-42e6-aa14-649a395731ae tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 798.180158] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-84ce00ea-efa7-42e6-aa14-649a395731ae tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 798.184953] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-84ce00ea-efa7-42e6-aa14-649a395731ae tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 798.184953] nova-conductor[52553]: Traceback (most recent call last): [ 798.184953] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 798.184953] nova-conductor[52553]: return func(*args, **kwargs) [ 798.184953] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 798.184953] nova-conductor[52553]: selections = self._select_destinations( [ 798.184953] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 798.184953] nova-conductor[52553]: selections = self._schedule( [ 798.184953] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 798.184953] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 798.184953] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 798.184953] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 798.184953] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 798.184953] nova-conductor[52553]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 798.185480] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-84ce00ea-efa7-42e6-aa14-649a395731ae tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] [instance: 2b979b25-32ec-4c37-92bc-9ceb0765d8ae] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 805.035824] nova-conductor[52554]: ERROR nova.conductor.manager [None req-91522dda-f2c2-420e-af40-182248369446 tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 805.035824] nova-conductor[52554]: Traceback (most recent call last): [ 805.035824] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 805.035824] nova-conductor[52554]: return func(*args, **kwargs) [ 805.035824] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 805.035824] nova-conductor[52554]: selections = self._select_destinations( [ 805.035824] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 805.035824] nova-conductor[52554]: selections = self._schedule( [ 805.035824] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 805.035824] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 805.035824] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 805.035824] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 805.035824] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 805.035824] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 805.035824] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 805.035824] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 805.035824] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 805.035824] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 805.035824] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 805.035824] nova-conductor[52554]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 805.035824] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 805.035824] nova-conductor[52554]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 805.035824] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 805.035824] nova-conductor[52554]: ERROR nova.conductor.manager result = self.transport._send( [ 805.035824] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 805.035824] nova-conductor[52554]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 805.035824] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 805.035824] nova-conductor[52554]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 805.035824] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 805.035824] nova-conductor[52554]: ERROR nova.conductor.manager raise result [ 805.035824] nova-conductor[52554]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 805.035824] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 805.035824] nova-conductor[52554]: ERROR nova.conductor.manager [ 805.035824] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 805.035824] nova-conductor[52554]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 805.035824] nova-conductor[52554]: ERROR nova.conductor.manager [ 805.035824] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 805.035824] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._select_destinations( [ 805.035824] nova-conductor[52554]: ERROR nova.conductor.manager [ 805.035824] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 805.035824] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._schedule( [ 805.035824] nova-conductor[52554]: ERROR nova.conductor.manager [ 805.035824] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 805.035824] nova-conductor[52554]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 805.035824] nova-conductor[52554]: ERROR nova.conductor.manager [ 805.035824] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 805.035824] nova-conductor[52554]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 805.035824] nova-conductor[52554]: ERROR nova.conductor.manager [ 805.035824] nova-conductor[52554]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 805.035824] nova-conductor[52554]: ERROR nova.conductor.manager [ 805.035824] nova-conductor[52554]: ERROR nova.conductor.manager [ 805.044586] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-91522dda-f2c2-420e-af40-182248369446 tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 805.044586] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-91522dda-f2c2-420e-af40-182248369446 tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 805.044586] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-91522dda-f2c2-420e-af40-182248369446 tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 805.118362] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-91522dda-f2c2-420e-af40-182248369446 tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: 7af53c7d-ab29-4473-b1bf-79afabc1a32d] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 805.118362] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-91522dda-f2c2-420e-af40-182248369446 tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 805.118362] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-91522dda-f2c2-420e-af40-182248369446 tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 805.118362] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-91522dda-f2c2-420e-af40-182248369446 tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" 
:: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 805.124031] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-91522dda-f2c2-420e-af40-182248369446 tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 805.124031] nova-conductor[52554]: Traceback (most recent call last): [ 805.124031] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 805.124031] nova-conductor[52554]: return func(*args, **kwargs) [ 805.124031] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 805.124031] nova-conductor[52554]: selections = self._select_destinations( [ 805.124031] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 805.124031] nova-conductor[52554]: selections = self._schedule( [ 805.124031] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 805.124031] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 805.124031] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 805.124031] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 805.124031] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 805.124031] nova-conductor[52554]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 805.124529] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-91522dda-f2c2-420e-af40-182248369446 tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: 7af53c7d-ab29-4473-b1bf-79afabc1a32d] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 807.078490] nova-conductor[52553]: ERROR nova.conductor.manager [None req-57b5a771-75d8-4517-8b37-df3dfc2cbbe9 tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 807.078490] nova-conductor[52553]: Traceback (most recent call last): [ 807.078490] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 807.078490] nova-conductor[52553]: return func(*args, **kwargs) [ 807.078490] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 807.078490] nova-conductor[52553]: selections = self._select_destinations( [ 807.078490] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 807.078490] nova-conductor[52553]: selections = self._schedule( [ 807.078490] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 807.078490] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 807.078490] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 807.078490] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 807.078490] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 807.078490] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 807.078490] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 807.078490] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 807.078490] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 807.078490] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 807.078490] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 807.078490] nova-conductor[52553]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 807.078490] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 807.078490] nova-conductor[52553]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 807.078490] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 807.078490] nova-conductor[52553]: ERROR nova.conductor.manager result = self.transport._send( [ 807.078490] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 807.078490] nova-conductor[52553]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 807.078490] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 807.078490] nova-conductor[52553]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 807.078490] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 807.078490] nova-conductor[52553]: ERROR nova.conductor.manager raise result [ 807.078490] nova-conductor[52553]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 807.078490] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 807.078490] nova-conductor[52553]: ERROR nova.conductor.manager [ 807.078490] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 807.078490] nova-conductor[52553]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 807.078490] nova-conductor[52553]: ERROR nova.conductor.manager [ 807.078490] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 807.078490] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._select_destinations( [ 807.078490] nova-conductor[52553]: ERROR nova.conductor.manager [ 807.078490] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 807.078490] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._schedule( [ 807.078490] nova-conductor[52553]: ERROR nova.conductor.manager [ 807.078490] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 807.078490] nova-conductor[52553]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 807.078490] nova-conductor[52553]: ERROR nova.conductor.manager [ 807.078490] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 807.078490] nova-conductor[52553]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 807.078490] nova-conductor[52553]: ERROR nova.conductor.manager [ 807.078490] nova-conductor[52553]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 807.078490] nova-conductor[52553]: ERROR nova.conductor.manager [ 807.078490] nova-conductor[52553]: ERROR nova.conductor.manager [ 807.099860] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-57b5a771-75d8-4517-8b37-df3dfc2cbbe9 tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 807.099860] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-57b5a771-75d8-4517-8b37-df3dfc2cbbe9 tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 807.099860] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-57b5a771-75d8-4517-8b37-df3dfc2cbbe9 tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 807.185028] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-57b5a771-75d8-4517-8b37-df3dfc2cbbe9 tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] [instance: 870a683b-06a3-4dcb-a65f-ef4c14227225] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 807.185028] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-57b5a771-75d8-4517-8b37-df3dfc2cbbe9 tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 807.185028] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-57b5a771-75d8-4517-8b37-df3dfc2cbbe9 tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 807.185028] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-57b5a771-75d8-4517-8b37-df3dfc2cbbe9 tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 807.189209] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-57b5a771-75d8-4517-8b37-df3dfc2cbbe9 tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 807.189209] nova-conductor[52553]: Traceback (most recent call last): [ 807.189209] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 807.189209] nova-conductor[52553]: return func(*args, **kwargs) [ 807.189209] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 807.189209] nova-conductor[52553]: selections = self._select_destinations( [ 807.189209] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 807.189209] nova-conductor[52553]: selections = self._schedule( [ 807.189209] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 807.189209] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 807.189209] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 807.189209] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 807.189209] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 807.189209] nova-conductor[52553]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 807.189908] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-57b5a771-75d8-4517-8b37-df3dfc2cbbe9 tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] [instance: 870a683b-06a3-4dcb-a65f-ef4c14227225] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 807.227951] nova-conductor[52554]: ERROR nova.conductor.manager [None req-f5a6a27d-a1f5-4982-8ad6-9d4c9bd70a63 tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 807.227951] nova-conductor[52554]: Traceback (most recent call last): [ 807.227951] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 807.227951] nova-conductor[52554]: return func(*args, **kwargs) [ 807.227951] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 807.227951] nova-conductor[52554]: selections = self._select_destinations( [ 807.227951] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 807.227951] nova-conductor[52554]: selections = self._schedule( [ 807.227951] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 807.227951] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 807.227951] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 807.227951] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 807.227951] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 807.227951] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 807.227951] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 807.227951] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 807.227951] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 807.227951] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 807.227951] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 807.227951] nova-conductor[52554]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 807.227951] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 807.227951] nova-conductor[52554]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 807.227951] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 807.227951] nova-conductor[52554]: ERROR nova.conductor.manager result = self.transport._send( [ 807.227951] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 807.227951] nova-conductor[52554]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 807.227951] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 807.227951] nova-conductor[52554]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 807.227951] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 807.227951] nova-conductor[52554]: ERROR nova.conductor.manager raise result [ 807.227951] nova-conductor[52554]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 807.227951] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 807.227951] nova-conductor[52554]: ERROR nova.conductor.manager [ 807.227951] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 807.227951] nova-conductor[52554]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 807.227951] nova-conductor[52554]: ERROR nova.conductor.manager [ 807.227951] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 807.227951] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._select_destinations( [ 807.227951] nova-conductor[52554]: ERROR nova.conductor.manager [ 807.227951] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 807.227951] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._schedule( [ 807.227951] nova-conductor[52554]: ERROR nova.conductor.manager [ 807.227951] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 807.227951] nova-conductor[52554]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 807.227951] nova-conductor[52554]: ERROR nova.conductor.manager [ 807.227951] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 807.227951] nova-conductor[52554]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 807.227951] nova-conductor[52554]: ERROR nova.conductor.manager [ 807.227951] nova-conductor[52554]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 807.227951] nova-conductor[52554]: ERROR nova.conductor.manager [ 807.227951] nova-conductor[52554]: ERROR nova.conductor.manager [ 807.244680] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-f5a6a27d-a1f5-4982-8ad6-9d4c9bd70a63 tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 807.244680] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-f5a6a27d-a1f5-4982-8ad6-9d4c9bd70a63 tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 807.247154] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-f5a6a27d-a1f5-4982-8ad6-9d4c9bd70a63 tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 807.293809] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-f5a6a27d-a1f5-4982-8ad6-9d4c9bd70a63 tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: c1652426-1d30-4f07-919a-bef9875c9e85] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 807.294585] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-f5a6a27d-a1f5-4982-8ad6-9d4c9bd70a63 tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 807.294816] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-f5a6a27d-a1f5-4982-8ad6-9d4c9bd70a63 tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 807.295118] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-f5a6a27d-a1f5-4982-8ad6-9d4c9bd70a63 tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" 
:: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 807.299912] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-f5a6a27d-a1f5-4982-8ad6-9d4c9bd70a63 tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 807.299912] nova-conductor[52554]: Traceback (most recent call last): [ 807.299912] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 807.299912] nova-conductor[52554]: return func(*args, **kwargs) [ 807.299912] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 807.299912] nova-conductor[52554]: selections = self._select_destinations( [ 807.299912] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 807.299912] nova-conductor[52554]: selections = self._schedule( [ 807.299912] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 807.299912] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 807.299912] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 807.299912] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 807.299912] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 807.299912] nova-conductor[52554]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 807.300756] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-f5a6a27d-a1f5-4982-8ad6-9d4c9bd70a63 tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: c1652426-1d30-4f07-919a-bef9875c9e85] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 811.823204] nova-conductor[52553]: ERROR nova.conductor.manager [None req-82adf047-d406-4aa9-85a4-2eb91c7dd22b tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 811.823204] nova-conductor[52553]: Traceback (most recent call last): [ 811.823204] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 811.823204] nova-conductor[52553]: return func(*args, **kwargs) [ 811.823204] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 811.823204] nova-conductor[52553]: selections = self._select_destinations( [ 811.823204] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 811.823204] nova-conductor[52553]: selections = self._schedule( [ 811.823204] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 811.823204] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 811.823204] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 811.823204] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 811.823204] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 811.823204] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 811.823204] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 811.823204] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 811.823204] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 811.823204] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 811.823204] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 811.823204] nova-conductor[52553]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 811.823204] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 811.823204] nova-conductor[52553]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 811.823204] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 811.823204] nova-conductor[52553]: ERROR nova.conductor.manager result = self.transport._send( [ 811.823204] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 811.823204] nova-conductor[52553]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 811.823204] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 811.823204] nova-conductor[52553]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 811.823204] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 811.823204] nova-conductor[52553]: ERROR nova.conductor.manager raise result [ 811.823204] nova-conductor[52553]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 811.823204] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 811.823204] nova-conductor[52553]: ERROR nova.conductor.manager [ 811.823204] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 811.823204] nova-conductor[52553]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 811.823204] nova-conductor[52553]: ERROR nova.conductor.manager [ 811.823204] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 811.823204] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._select_destinations( [ 811.823204] nova-conductor[52553]: ERROR nova.conductor.manager [ 811.823204] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 811.823204] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._schedule( [ 811.823204] nova-conductor[52553]: ERROR nova.conductor.manager [ 811.823204] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 811.823204] nova-conductor[52553]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 811.823204] nova-conductor[52553]: ERROR nova.conductor.manager [ 811.823204] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 811.823204] nova-conductor[52553]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 811.823204] nova-conductor[52553]: ERROR nova.conductor.manager [ 811.823204] nova-conductor[52553]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 811.823204] nova-conductor[52553]: ERROR nova.conductor.manager [ 811.823204] nova-conductor[52553]: ERROR nova.conductor.manager [ 811.833329] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-82adf047-d406-4aa9-85a4-2eb91c7dd22b tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 811.833329] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-82adf047-d406-4aa9-85a4-2eb91c7dd22b tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 811.833329] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-82adf047-d406-4aa9-85a4-2eb91c7dd22b tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 811.847996] nova-conductor[52553]: ERROR nova.scheduler.utils [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not 
correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance 23984fc7-95de-43c3-a21e-894fab241dce was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 811.849148] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Rescheduling: True {{(pid=52553) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 811.849463] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 23984fc7-95de-43c3-a21e-894fab241dce.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 23984fc7-95de-43c3-a21e-894fab241dce. [ 811.849749] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 23984fc7-95de-43c3-a21e-894fab241dce. 
[ 811.878410] nova-conductor[52553]: DEBUG nova.network.neutron [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] deallocate_for_instance() {{(pid=52553) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 811.912550] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-82adf047-d406-4aa9-85a4-2eb91c7dd22b tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] [instance: 2eed2e90-c676-4dc6-98c7-2c989b0d41c9] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 811.913262] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-82adf047-d406-4aa9-85a4-2eb91c7dd22b tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 811.913475] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-82adf047-d406-4aa9-85a4-2eb91c7dd22b tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 811.913669] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-82adf047-d406-4aa9-85a4-2eb91c7dd22b tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 811.916553] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-82adf047-d406-4aa9-85a4-2eb91c7dd22b tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. 
[ 811.916553] nova-conductor[52553]: Traceback (most recent call last): [ 811.916553] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 811.916553] nova-conductor[52553]: return func(*args, **kwargs) [ 811.916553] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 811.916553] nova-conductor[52553]: selections = self._select_destinations( [ 811.916553] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 811.916553] nova-conductor[52553]: selections = self._schedule( [ 811.916553] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 811.916553] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 811.916553] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 811.916553] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 811.916553] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 811.916553] nova-conductor[52553]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 811.917089] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-82adf047-d406-4aa9-85a4-2eb91c7dd22b tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] [instance: 2eed2e90-c676-4dc6-98c7-2c989b0d41c9] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 811.923015] nova-conductor[52554]: DEBUG nova.db.main.api [None req-6f309cba-5afa-4dfc-86e4-f0361c865fd0 tempest-ImagesOneServerNegativeTestJSON-2067137125 tempest-ImagesOneServerNegativeTestJSON-2067137125-project-member] Created instance_extra for b62dda0a-da1d-4109-a925-bb32d01da242 {{(pid=52554) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 811.962275] nova-conductor[52553]: DEBUG nova.network.neutron [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Instance cache missing network info. 
{{(pid=52553) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 811.965788] nova-conductor[52553]: DEBUG nova.network.neutron [None req-91fa722a-bd01-4d75-bbab-eda42ae56232 tempest-ServersAdminNegativeTestJSON-1483049067 tempest-ServersAdminNegativeTestJSON-1483049067-project-member] [instance: 23984fc7-95de-43c3-a21e-894fab241dce] Updating instance_info_cache with network_info: [] {{(pid=52553) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 811.999610] nova-conductor[52553]: DEBUG nova.db.main.api [None req-49a8e915-bce4-4384-b6b7-adba96e5e1e8 tempest-ServerActionsTestOtherA-1665339402 tempest-ServerActionsTestOtherA-1665339402-project-member] Created instance_extra for 6604de35-7683-4d5d-ac6f-13752ccb940c {{(pid=52553) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 812.075701] nova-conductor[52554]: DEBUG nova.db.main.api [None req-5718d835-b6db-483d-96db-232287d8ec49 tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Created instance_extra for 189903f4-37c9-4331-bb23-245ed68ecaae {{(pid=52554) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 812.149176] nova-conductor[52553]: DEBUG nova.db.main.api [None req-50852618-bfb2-45e1-9afb-62ea04595c1e tempest-ServersNegativeTestJSON-888982916 tempest-ServersNegativeTestJSON-888982916-project-member] Created instance_extra for 75cc0c18-27d3-4074-897b-08812a11829c {{(pid=52553) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 812.226029] nova-conductor[52553]: DEBUG nova.db.main.api [None req-ada24a6c-d679-430b-9e9c-803e8277e103 tempest-ServerAddressesTestJSON-1259181591 tempest-ServerAddressesTestJSON-1259181591-project-member] Created instance_extra for 22aa54d4-80ec-4d56-9239-41810c469b9e {{(pid=52553) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 812.341487] nova-conductor[52553]: DEBUG nova.db.main.api [None req-0f1e22a8-9fae-4bca-95a7-848e187055da tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Created instance_extra for 885fe65d-ee02-4ed7-8d59-109775086038 {{(pid=52553) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 812.427211] nova-conductor[52553]: DEBUG nova.db.main.api [None req-b194fefe-26fd-4f0e-aac8-c4c07268c0cc tempest-ListServerFiltersTestJSON-1556780718 tempest-ListServerFiltersTestJSON-1556780718-project-member] Created instance_extra for ec879414-4534-4d0e-a65e-65baff80b16e {{(pid=52553) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 812.500895] nova-conductor[52554]: DEBUG nova.db.main.api [None req-9954a97c-742a-40d5-b761-9e61e7e7d295 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Created instance_extra for cde76b14-ee01-44c8-8004-39cdf91e9889 {{(pid=52554) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 812.569223] nova-conductor[52553]: DEBUG nova.db.main.api [None req-bba4c956-14a4-4f9e-a55a-3dfb1205b3a3 tempest-ListImageFiltersTestJSON-946347293 tempest-ListImageFiltersTestJSON-946347293-project-member] Created instance_extra for ca732c56-b1d1-40bf-96b6-4b93bc5ff29d {{(pid=52553) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 812.655134] nova-conductor[52554]: DEBUG nova.db.main.api [None req-436aab8f-2d4a-47ba-b2c0-250864227759 tempest-TenantUsagesTestJSON-136728812 
tempest-TenantUsagesTestJSON-136728812-project-member] Created instance_extra for 6f0c0004-7fd2-49bf-bb1e-48774c481497 {{(pid=52554) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 814.013060] nova-conductor[52554]: ERROR nova.conductor.manager [None req-54f18c0f-dbfb-49e2-a515-fe828c2544ff tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 814.013060] nova-conductor[52554]: Traceback (most recent call last): [ 814.013060] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 814.013060] nova-conductor[52554]: return func(*args, **kwargs) [ 814.013060] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 814.013060] nova-conductor[52554]: selections = self._select_destinations( [ 814.013060] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 814.013060] nova-conductor[52554]: selections = self._schedule( [ 814.013060] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 814.013060] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 814.013060] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 814.013060] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 814.013060] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 814.013060] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 814.013060] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 814.013060] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 814.013060] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 814.013060] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 814.013060] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 814.013060] nova-conductor[52554]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 814.013060] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 814.013060] nova-conductor[52554]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 814.013060] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 814.013060] nova-conductor[52554]: ERROR nova.conductor.manager result = self.transport._send( [ 814.013060] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 814.013060] nova-conductor[52554]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 814.013060] nova-conductor[52554]: ERROR nova.conductor.manager File 
"/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 814.013060] nova-conductor[52554]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 814.013060] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 814.013060] nova-conductor[52554]: ERROR nova.conductor.manager raise result [ 814.013060] nova-conductor[52554]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 814.013060] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 814.013060] nova-conductor[52554]: ERROR nova.conductor.manager [ 814.013060] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 814.013060] nova-conductor[52554]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 814.013060] nova-conductor[52554]: ERROR nova.conductor.manager [ 814.013060] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 814.013060] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._select_destinations( [ 814.013060] nova-conductor[52554]: ERROR nova.conductor.manager [ 814.013060] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 814.013060] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._schedule( [ 814.013060] nova-conductor[52554]: ERROR nova.conductor.manager [ 814.013060] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 814.013060] nova-conductor[52554]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 814.013060] nova-conductor[52554]: ERROR nova.conductor.manager [ 814.013060] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 814.013060] nova-conductor[52554]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 814.013060] nova-conductor[52554]: ERROR nova.conductor.manager [ 814.013060] nova-conductor[52554]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 814.013060] nova-conductor[52554]: ERROR nova.conductor.manager [ 814.013060] nova-conductor[52554]: ERROR nova.conductor.manager [ 814.020898] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-54f18c0f-dbfb-49e2-a515-fe828c2544ff tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 814.021155] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-54f18c0f-dbfb-49e2-a515-fe828c2544ff tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 814.021333] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-54f18c0f-dbfb-49e2-a515-fe828c2544ff tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 814.084664] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-54f18c0f-dbfb-49e2-a515-fe828c2544ff tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: 9b97f9da-31a8-40cd-9da3-3b5e548b9ef0] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 814.085013] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-54f18c0f-dbfb-49e2-a515-fe828c2544ff tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 814.085232] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-54f18c0f-dbfb-49e2-a515-fe828c2544ff tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 814.085411] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-54f18c0f-dbfb-49e2-a515-fe828c2544ff tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" 
:: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 814.092270] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-54f18c0f-dbfb-49e2-a515-fe828c2544ff tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 814.092270] nova-conductor[52554]: Traceback (most recent call last): [ 814.092270] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 814.092270] nova-conductor[52554]: return func(*args, **kwargs) [ 814.092270] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 814.092270] nova-conductor[52554]: selections = self._select_destinations( [ 814.092270] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 814.092270] nova-conductor[52554]: selections = self._schedule( [ 814.092270] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 814.092270] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 814.092270] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 814.092270] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 814.092270] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 814.092270] nova-conductor[52554]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 814.092798] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-54f18c0f-dbfb-49e2-a515-fe828c2544ff tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: 9b97f9da-31a8-40cd-9da3-3b5e548b9ef0] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 816.533136] nova-conductor[52553]: ERROR nova.conductor.manager [None req-b42fa7dd-2fa3-4290-b805-bf086aa82854 tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 816.533136] nova-conductor[52553]: Traceback (most recent call last): [ 816.533136] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 816.533136] nova-conductor[52553]: return func(*args, **kwargs) [ 816.533136] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 816.533136] nova-conductor[52553]: selections = self._select_destinations( [ 816.533136] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 816.533136] nova-conductor[52553]: selections = self._schedule( [ 816.533136] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 816.533136] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 816.533136] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 816.533136] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 816.533136] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 816.533136] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 816.533136] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 816.533136] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 816.533136] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 816.533136] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 816.533136] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 816.533136] nova-conductor[52553]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 816.533136] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 816.533136] nova-conductor[52553]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 816.533136] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 816.533136] nova-conductor[52553]: ERROR nova.conductor.manager result = self.transport._send( [ 816.533136] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 816.533136] nova-conductor[52553]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 816.533136] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 816.533136] nova-conductor[52553]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 816.533136] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 816.533136] nova-conductor[52553]: ERROR nova.conductor.manager raise result [ 816.533136] nova-conductor[52553]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 816.533136] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 816.533136] nova-conductor[52553]: ERROR nova.conductor.manager [ 816.533136] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 816.533136] nova-conductor[52553]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 816.533136] nova-conductor[52553]: ERROR nova.conductor.manager [ 816.533136] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 816.533136] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._select_destinations( [ 816.533136] nova-conductor[52553]: ERROR nova.conductor.manager [ 816.533136] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 816.533136] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._schedule( [ 816.533136] nova-conductor[52553]: ERROR nova.conductor.manager [ 816.533136] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 816.533136] nova-conductor[52553]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 816.533136] nova-conductor[52553]: ERROR nova.conductor.manager [ 816.533136] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 816.533136] nova-conductor[52553]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 816.533136] nova-conductor[52553]: ERROR nova.conductor.manager [ 816.533136] nova-conductor[52553]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
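Each failure is reported twice because the exception crosses an RPC boundary: nova-scheduler raises NoValidHost, oslo.messaging carries the failure back in the reply (the transport._send and "raise result" frames above), and nova-conductor re-raises it locally under the NoValidHost_Remote name with the scheduler-side traceback appended. The toy round trip below illustrates that general pattern with a made-up reply format; it is not oslo.messaging's wire protocol or API.

    import traceback


    def serialize_failure(exc):
        """Server side: capture class name, message and traceback so the
        failure can travel back to the caller inside the RPC reply."""
        return {
            "class": type(exc).__name__,
            "message": str(exc),
            "remote_tb": traceback.format_exc(),
        }


    def deserialize_failure(failure):
        """Client side: build a stand-in exception type whose name carries a
        _Remote suffix, as seen in the conductor log, and attach the remote
        traceback to its message."""
        remote_cls = type(failure["class"] + "_Remote", (Exception,), {})
        return remote_cls(failure["message"] + "\n" + failure["remote_tb"])


    def fake_rpc_call():
        # "Scheduler" side fails; "conductor" side re-raises the stand-in.
        try:
            raise RuntimeError("No valid host was found. "
                               "There are not enough hosts available.")
        except RuntimeError as exc:
            reply = serialize_failure(exc)
        raise deserialize_failure(reply)


    try:
        fake_rpc_call()
    except Exception as exc:
        print(type(exc).__name__)  # RuntimeError_Remote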
[ 816.533136] nova-conductor[52553]: ERROR nova.conductor.manager [ 816.533136] nova-conductor[52553]: ERROR nova.conductor.manager [ 816.543022] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-b42fa7dd-2fa3-4290-b805-bf086aa82854 tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 816.543022] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-b42fa7dd-2fa3-4290-b805-bf086aa82854 tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 816.543022] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-b42fa7dd-2fa3-4290-b805-bf086aa82854 tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 816.591673] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-b42fa7dd-2fa3-4290-b805-bf086aa82854 tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] [instance: df8689d6-34fa-437f-9eaf-c37a7672c25a] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 816.592581] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-b42fa7dd-2fa3-4290-b805-bf086aa82854 tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 816.592581] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-b42fa7dd-2fa3-4290-b805-bf086aa82854 tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 816.592831] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-b42fa7dd-2fa3-4290-b805-bf086aa82854 tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 816.598830] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-b42fa7dd-2fa3-4290-b805-bf086aa82854 tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 816.598830] nova-conductor[52553]: Traceback (most recent call last): [ 816.598830] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 816.598830] nova-conductor[52553]: return func(*args, **kwargs) [ 816.598830] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 816.598830] nova-conductor[52553]: selections = self._select_destinations( [ 816.598830] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 816.598830] nova-conductor[52553]: selections = self._schedule( [ 816.598830] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 816.598830] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 816.598830] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 816.598830] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 816.598830] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 816.598830] nova-conductor[52553]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 816.599417] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-b42fa7dd-2fa3-4290-b805-bf086aa82854 tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] [instance: df8689d6-34fa-437f-9eaf-c37a7672c25a] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 816.782973] nova-conductor[52553]: ERROR nova.conductor.manager [None req-88930b03-16c2-4908-8909-ded8a3cbed0d tempest-ServerTagsTestJSON-1180953540 tempest-ServerTagsTestJSON-1180953540-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 816.782973] nova-conductor[52553]: Traceback (most recent call last): [ 816.782973] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 816.782973] nova-conductor[52553]: return func(*args, **kwargs) [ 816.782973] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 816.782973] nova-conductor[52553]: selections = self._select_destinations( [ 816.782973] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 816.782973] nova-conductor[52553]: selections = self._schedule( [ 816.782973] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 816.782973] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 816.782973] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 816.782973] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 816.782973] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 816.782973] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 816.782973] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 816.782973] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 816.782973] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 816.782973] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 816.782973] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 816.782973] nova-conductor[52553]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 816.782973] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 816.782973] nova-conductor[52553]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 816.782973] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 816.782973] nova-conductor[52553]: ERROR nova.conductor.manager result = self.transport._send( [ 816.782973] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 816.782973] nova-conductor[52553]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 816.782973] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 816.782973] nova-conductor[52553]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 816.782973] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 816.782973] nova-conductor[52553]: ERROR nova.conductor.manager raise result [ 816.782973] nova-conductor[52553]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 816.782973] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 816.782973] nova-conductor[52553]: ERROR nova.conductor.manager [ 816.782973] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 816.782973] nova-conductor[52553]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 816.782973] nova-conductor[52553]: ERROR nova.conductor.manager [ 816.782973] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 816.782973] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._select_destinations( [ 816.782973] nova-conductor[52553]: ERROR nova.conductor.manager [ 816.782973] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 816.782973] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._schedule( [ 816.782973] nova-conductor[52553]: ERROR nova.conductor.manager [ 816.782973] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 816.782973] nova-conductor[52553]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 816.782973] nova-conductor[52553]: ERROR nova.conductor.manager [ 816.782973] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 816.782973] nova-conductor[52553]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 816.782973] nova-conductor[52553]: ERROR nova.conductor.manager [ 816.782973] nova-conductor[52553]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
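On the conductor side the failure is terminal for this build: schedule_and_build_instances catches the remote NoValidHost, nova.scheduler.utils logs the "Failed to compute_task_build_instances" warning, and the instance is parked in ERROR so the API reports the fault instead of retrying. The sketch below only shows that control-flow shape; set_vm_state_error is a hypothetical helper, not a Nova function, and the real handling does more bookkeeping than this.

    class NoValidHost(Exception):
        pass


    def set_vm_state_error(instance_uuid, reason):
        """Hypothetical helper standing in for the instance-state update."""
        print(f"[instance: {instance_uuid}] Setting instance to ERROR state.: {reason}")


    def schedule_and_build(instance_uuid, select_destinations):
        """Shape of the failure path in the log: a scheduling error is caught,
        logged as a warning, and the instance ends up in ERROR."""
        try:
            return select_destinations()
        except NoValidHost as exc:
            print(f"Failed to compute_task_build_instances: {exc}")
            set_vm_state_error(instance_uuid, exc)
            return None


    def failing_scheduler():
        raise NoValidHost("No valid host was found. "
                          "There are not enough hosts available.")


    schedule_and_build("9b97f9da-31a8-40cd-9da3-3b5e548b9ef0", failing_scheduler)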
[ 816.782973] nova-conductor[52553]: ERROR nova.conductor.manager [ 816.782973] nova-conductor[52553]: ERROR nova.conductor.manager [ 816.791630] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-88930b03-16c2-4908-8909-ded8a3cbed0d tempest-ServerTagsTestJSON-1180953540 tempest-ServerTagsTestJSON-1180953540-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 816.793719] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-88930b03-16c2-4908-8909-ded8a3cbed0d tempest-ServerTagsTestJSON-1180953540 tempest-ServerTagsTestJSON-1180953540-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 816.793719] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-88930b03-16c2-4908-8909-ded8a3cbed0d tempest-ServerTagsTestJSON-1180953540 tempest-ServerTagsTestJSON-1180953540-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 816.847813] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-88930b03-16c2-4908-8909-ded8a3cbed0d tempest-ServerTagsTestJSON-1180953540 tempest-ServerTagsTestJSON-1180953540-project-member] [instance: 53d01198-b00e-408d-b08e-2abc77091da0] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 816.849255] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-88930b03-16c2-4908-8909-ded8a3cbed0d tempest-ServerTagsTestJSON-1180953540 tempest-ServerTagsTestJSON-1180953540-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 816.849686] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-88930b03-16c2-4908-8909-ded8a3cbed0d tempest-ServerTagsTestJSON-1180953540 tempest-ServerTagsTestJSON-1180953540-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 816.849788] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-88930b03-16c2-4908-8909-ded8a3cbed0d tempest-ServerTagsTestJSON-1180953540 tempest-ServerTagsTestJSON-1180953540-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 816.853714] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-88930b03-16c2-4908-8909-ded8a3cbed0d tempest-ServerTagsTestJSON-1180953540 tempest-ServerTagsTestJSON-1180953540-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 816.853714] nova-conductor[52553]: Traceback (most recent call last): [ 816.853714] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 816.853714] nova-conductor[52553]: return func(*args, **kwargs) [ 816.853714] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 816.853714] nova-conductor[52553]: selections = self._select_destinations( [ 816.853714] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 816.853714] nova-conductor[52553]: selections = self._schedule( [ 816.853714] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 816.853714] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 816.853714] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 816.853714] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 816.853714] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 816.853714] nova-conductor[52553]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 816.854251] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-88930b03-16c2-4908-8909-ded8a3cbed0d tempest-ServerTagsTestJSON-1180953540 tempest-ServerTagsTestJSON-1180953540-project-member] [instance: 53d01198-b00e-408d-b08e-2abc77091da0] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 818.172020] nova-conductor[52554]: ERROR nova.conductor.manager [None req-462f7ad4-5898-4cf3-af4a-303d1593ae79 tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 818.172020] nova-conductor[52554]: Traceback (most recent call last): [ 818.172020] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 818.172020] nova-conductor[52554]: return func(*args, **kwargs) [ 818.172020] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 818.172020] nova-conductor[52554]: selections = self._select_destinations( [ 818.172020] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 818.172020] nova-conductor[52554]: selections = self._schedule( [ 818.172020] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 818.172020] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 818.172020] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 818.172020] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 818.172020] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 818.172020] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 818.172020] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 818.172020] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 818.172020] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 818.172020] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 818.172020] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 818.172020] nova-conductor[52554]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 818.172020] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 818.172020] nova-conductor[52554]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 818.172020] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 818.172020] nova-conductor[52554]: ERROR nova.conductor.manager result = self.transport._send( [ 818.172020] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 818.172020] nova-conductor[52554]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 818.172020] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 818.172020] nova-conductor[52554]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 818.172020] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 818.172020] nova-conductor[52554]: ERROR nova.conductor.manager raise result [ 818.172020] nova-conductor[52554]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 818.172020] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 818.172020] nova-conductor[52554]: ERROR nova.conductor.manager [ 818.172020] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 818.172020] nova-conductor[52554]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 818.172020] nova-conductor[52554]: ERROR nova.conductor.manager [ 818.172020] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 818.172020] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._select_destinations( [ 818.172020] nova-conductor[52554]: ERROR nova.conductor.manager [ 818.172020] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 818.172020] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._schedule( [ 818.172020] nova-conductor[52554]: ERROR nova.conductor.manager [ 818.172020] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 818.172020] nova-conductor[52554]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 818.172020] nova-conductor[52554]: ERROR nova.conductor.manager [ 818.172020] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 818.172020] nova-conductor[52554]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 818.172020] nova-conductor[52554]: ERROR nova.conductor.manager [ 818.172020] nova-conductor[52554]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
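The block_device_mapping entries logged for each request describe the root disk the conductor would have created: an image-backed disk (source_type='image') materialized on local hypervisor storage (destination_type='local'), boot index 0, deleted with the server. The dict below simply restates the non-empty fields from that log line in plain Python for readability; nothing here adds information beyond the grouping and comments.

    # Non-empty fields from the BlockDeviceMapping entries above, restated as
    # a plain dict; unset/None fields are omitted for readability.
    root_disk_bdm = {
        "source_type": "image",        # boot from a Glance image ...
        "destination_type": "local",   # ... onto local hypervisor storage
        "image_id": "856e89ba-b7a4-4a81-ad9d-2997fe327c0c",
        "boot_index": 0,               # marks this as the boot (root) disk
        "device_type": "disk",
        "delete_on_termination": True, # discard the disk with the server
        "encrypted": False,
        "no_device": False,
    }

    for key, value in root_disk_bdm.items():
        print(f"{key} = {value}")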
[ 818.172020] nova-conductor[52554]: ERROR nova.conductor.manager [ 818.172020] nova-conductor[52554]: ERROR nova.conductor.manager [ 818.181258] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-462f7ad4-5898-4cf3-af4a-303d1593ae79 tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 818.181311] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-462f7ad4-5898-4cf3-af4a-303d1593ae79 tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 818.182363] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-462f7ad4-5898-4cf3-af4a-303d1593ae79 tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 818.229324] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-462f7ad4-5898-4cf3-af4a-303d1593ae79 tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: 3f70699b-2ea9-4700-bad6-8a48abfcab5b] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 818.230028] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-462f7ad4-5898-4cf3-af4a-303d1593ae79 tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 818.230240] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-462f7ad4-5898-4cf3-af4a-303d1593ae79 tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 818.230419] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-462f7ad4-5898-4cf3-af4a-303d1593ae79 tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" 
:: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 818.235958] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-462f7ad4-5898-4cf3-af4a-303d1593ae79 tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 818.235958] nova-conductor[52554]: Traceback (most recent call last): [ 818.235958] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 818.235958] nova-conductor[52554]: return func(*args, **kwargs) [ 818.235958] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 818.235958] nova-conductor[52554]: selections = self._select_destinations( [ 818.235958] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 818.235958] nova-conductor[52554]: selections = self._schedule( [ 818.235958] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 818.235958] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 818.235958] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 818.235958] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 818.235958] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 818.235958] nova-conductor[52554]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 818.236493] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-462f7ad4-5898-4cf3-af4a-303d1593ae79 tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: 3f70699b-2ea9-4700-bad6-8a48abfcab5b] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 819.375667] nova-conductor[52553]: ERROR nova.conductor.manager [None req-2b58b7d1-88b2-4659-8a8c-1b807ec8776a tempest-AttachInterfacesUnderV243Test-1729314279 tempest-AttachInterfacesUnderV243Test-1729314279-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 819.375667] nova-conductor[52553]: Traceback (most recent call last): [ 819.375667] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 819.375667] nova-conductor[52553]: return func(*args, **kwargs) [ 819.375667] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 819.375667] nova-conductor[52553]: selections = self._select_destinations( [ 819.375667] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 819.375667] nova-conductor[52553]: selections = self._schedule( [ 819.375667] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 819.375667] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 819.375667] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 819.375667] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 819.375667] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 819.375667] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 819.375667] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 819.375667] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 819.375667] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 819.375667] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 819.375667] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 819.375667] nova-conductor[52553]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 819.375667] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 819.375667] nova-conductor[52553]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 819.375667] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 819.375667] nova-conductor[52553]: ERROR nova.conductor.manager result = self.transport._send( [ 819.375667] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 819.375667] nova-conductor[52553]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 819.375667] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 819.375667] nova-conductor[52553]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 819.375667] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 819.375667] nova-conductor[52553]: ERROR nova.conductor.manager raise result [ 819.375667] nova-conductor[52553]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 819.375667] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 819.375667] nova-conductor[52553]: ERROR nova.conductor.manager [ 819.375667] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 819.375667] nova-conductor[52553]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 819.375667] nova-conductor[52553]: ERROR nova.conductor.manager [ 819.375667] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 819.375667] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._select_destinations( [ 819.375667] nova-conductor[52553]: ERROR nova.conductor.manager [ 819.375667] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 819.375667] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._schedule( [ 819.375667] nova-conductor[52553]: ERROR nova.conductor.manager [ 819.375667] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 819.375667] nova-conductor[52553]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 819.375667] nova-conductor[52553]: ERROR nova.conductor.manager [ 819.375667] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 819.375667] nova-conductor[52553]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 819.375667] nova-conductor[52553]: ERROR nova.conductor.manager [ 819.375667] nova-conductor[52553]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
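The same pattern repeats for every tempest request in this capture, so the quickest way to see the blast radius is to pull out each request id and the instance it pushed to ERROR. The script below does that for a log shaped like the lines above; the file name is an assumption, and the regex relies only on the "[None req-...]" context and the "Setting instance to ERROR state" warning visible here.

    import re

    # Point this at your own capture; the name is only an assumption.
    LOG_PATH = "nova-conductor.log"

    PATTERN = re.compile(
        r"\[None (?P<req>req-[0-9a-f-]+) .*?"
        r"\[instance: (?P<uuid>[0-9a-f-]+)\] Setting instance to ERROR state"
    )


    def failed_instances(path):
        """Yield (request id, instance uuid) for every instance the
        conductor set to ERROR after a scheduling failure."""
        with open(path, errors="replace") as handle:
            for line in handle:
                match = PATTERN.search(line)
                if match:
                    yield match.group("req"), match.group("uuid")


    if __name__ == "__main__":
        for req_id, instance_uuid in failed_instances(LOG_PATH):
            print(req_id, instance_uuid)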
[ 819.375667] nova-conductor[52553]: ERROR nova.conductor.manager [ 819.375667] nova-conductor[52553]: ERROR nova.conductor.manager [ 819.386784] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-2b58b7d1-88b2-4659-8a8c-1b807ec8776a tempest-AttachInterfacesUnderV243Test-1729314279 tempest-AttachInterfacesUnderV243Test-1729314279-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 819.387033] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-2b58b7d1-88b2-4659-8a8c-1b807ec8776a tempest-AttachInterfacesUnderV243Test-1729314279 tempest-AttachInterfacesUnderV243Test-1729314279-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 819.387208] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-2b58b7d1-88b2-4659-8a8c-1b807ec8776a tempest-AttachInterfacesUnderV243Test-1729314279 tempest-AttachInterfacesUnderV243Test-1729314279-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 819.456126] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-2b58b7d1-88b2-4659-8a8c-1b807ec8776a tempest-AttachInterfacesUnderV243Test-1729314279 tempest-AttachInterfacesUnderV243Test-1729314279-project-member] [instance: 2f4f4601-de9f-4f41-93e9-a1526b2d1659] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 819.456410] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-2b58b7d1-88b2-4659-8a8c-1b807ec8776a tempest-AttachInterfacesUnderV243Test-1729314279 tempest-AttachInterfacesUnderV243Test-1729314279-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 819.456410] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-2b58b7d1-88b2-4659-8a8c-1b807ec8776a tempest-AttachInterfacesUnderV243Test-1729314279 tempest-AttachInterfacesUnderV243Test-1729314279-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 819.456569] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-2b58b7d1-88b2-4659-8a8c-1b807ec8776a tempest-AttachInterfacesUnderV243Test-1729314279 
tempest-AttachInterfacesUnderV243Test-1729314279-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 819.464578] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-2b58b7d1-88b2-4659-8a8c-1b807ec8776a tempest-AttachInterfacesUnderV243Test-1729314279 tempest-AttachInterfacesUnderV243Test-1729314279-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 819.464578] nova-conductor[52553]: Traceback (most recent call last): [ 819.464578] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 819.464578] nova-conductor[52553]: return func(*args, **kwargs) [ 819.464578] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 819.464578] nova-conductor[52553]: selections = self._select_destinations( [ 819.464578] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 819.464578] nova-conductor[52553]: selections = self._schedule( [ 819.464578] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 819.464578] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 819.464578] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 819.464578] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 819.464578] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 819.464578] nova-conductor[52553]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 819.464578] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-2b58b7d1-88b2-4659-8a8c-1b807ec8776a tempest-AttachInterfacesUnderV243Test-1729314279 tempest-AttachInterfacesUnderV243Test-1729314279-project-member] [instance: 2f4f4601-de9f-4f41-93e9-a1526b2d1659] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 819.984020] nova-conductor[52554]: ERROR nova.conductor.manager [None req-0956fc0d-5bbf-49a0-8394-d4ff67018b11 tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 819.984020] nova-conductor[52554]: Traceback (most recent call last): [ 819.984020] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 819.984020] nova-conductor[52554]: return func(*args, **kwargs) [ 819.984020] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 819.984020] nova-conductor[52554]: selections = self._select_destinations( [ 819.984020] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 819.984020] nova-conductor[52554]: selections = self._schedule( [ 819.984020] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 819.984020] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 819.984020] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 819.984020] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 819.984020] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 819.984020] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 819.984020] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 819.984020] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 819.984020] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 819.984020] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 819.984020] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 819.984020] nova-conductor[52554]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 819.984020] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 819.984020] nova-conductor[52554]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 819.984020] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 819.984020] nova-conductor[52554]: ERROR nova.conductor.manager result = self.transport._send( [ 819.984020] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 819.984020] nova-conductor[52554]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 819.984020] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 819.984020] nova-conductor[52554]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 819.984020] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 819.984020] nova-conductor[52554]: ERROR nova.conductor.manager raise result [ 819.984020] nova-conductor[52554]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 819.984020] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 819.984020] nova-conductor[52554]: ERROR nova.conductor.manager [ 819.984020] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 819.984020] nova-conductor[52554]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 819.984020] nova-conductor[52554]: ERROR nova.conductor.manager [ 819.984020] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 819.984020] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._select_destinations( [ 819.984020] nova-conductor[52554]: ERROR nova.conductor.manager [ 819.984020] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 819.984020] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._schedule( [ 819.984020] nova-conductor[52554]: ERROR nova.conductor.manager [ 819.984020] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 819.984020] nova-conductor[52554]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 819.984020] nova-conductor[52554]: ERROR nova.conductor.manager [ 819.984020] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 819.984020] nova-conductor[52554]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 819.984020] nova-conductor[52554]: ERROR nova.conductor.manager [ 819.984020] nova-conductor[52554]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 819.984020] nova-conductor[52554]: ERROR nova.conductor.manager [ 819.984020] nova-conductor[52554]: ERROR nova.conductor.manager [ 819.998210] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0956fc0d-5bbf-49a0-8394-d4ff67018b11 tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 819.998210] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0956fc0d-5bbf-49a0-8394-d4ff67018b11 tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 819.998210] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0956fc0d-5bbf-49a0-8394-d4ff67018b11 tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 820.043834] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-0956fc0d-5bbf-49a0-8394-d4ff67018b11 tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] [instance: 1445da40-483f-4094-87c4-6c5a2214d7af] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 820.045041] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0956fc0d-5bbf-49a0-8394-d4ff67018b11 tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 820.045416] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0956fc0d-5bbf-49a0-8394-d4ff67018b11 tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 820.045719] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-0956fc0d-5bbf-49a0-8394-d4ff67018b11 tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 820.049155] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-0956fc0d-5bbf-49a0-8394-d4ff67018b11 tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 820.049155] nova-conductor[52554]: Traceback (most recent call last): [ 820.049155] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 820.049155] nova-conductor[52554]: return func(*args, **kwargs) [ 820.049155] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 820.049155] nova-conductor[52554]: selections = self._select_destinations( [ 820.049155] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 820.049155] nova-conductor[52554]: selections = self._schedule( [ 820.049155] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 820.049155] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 820.049155] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 820.049155] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 820.049155] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 820.049155] nova-conductor[52554]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 820.051241] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-0956fc0d-5bbf-49a0-8394-d4ff67018b11 tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] [instance: 1445da40-483f-4094-87c4-6c5a2214d7af] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 821.392425] nova-conductor[52553]: ERROR nova.conductor.manager [None req-64ea3b7d-3e5d-4e9f-89ef-aac8079f485d tempest-ServersNegativeTestMultiTenantJSON-1475797793 tempest-ServersNegativeTestMultiTenantJSON-1475797793-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 821.392425] nova-conductor[52553]: Traceback (most recent call last): [ 821.392425] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 821.392425] nova-conductor[52553]: return func(*args, **kwargs) [ 821.392425] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 821.392425] nova-conductor[52553]: selections = self._select_destinations( [ 821.392425] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 821.392425] nova-conductor[52553]: selections = self._schedule( [ 821.392425] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 821.392425] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 821.392425] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 821.392425] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 821.392425] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 821.392425] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 821.392425] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 821.392425] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 821.392425] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 821.392425] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 821.392425] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 821.392425] nova-conductor[52553]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 821.392425] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 821.392425] nova-conductor[52553]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 821.392425] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 821.392425] nova-conductor[52553]: ERROR nova.conductor.manager result = self.transport._send( [ 821.392425] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 821.392425] nova-conductor[52553]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 821.392425] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 821.392425] nova-conductor[52553]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 821.392425] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 821.392425] nova-conductor[52553]: ERROR nova.conductor.manager raise result [ 821.392425] nova-conductor[52553]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 821.392425] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 821.392425] nova-conductor[52553]: ERROR nova.conductor.manager [ 821.392425] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 821.392425] nova-conductor[52553]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 821.392425] nova-conductor[52553]: ERROR nova.conductor.manager [ 821.392425] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 821.392425] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._select_destinations( [ 821.392425] nova-conductor[52553]: ERROR nova.conductor.manager [ 821.392425] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 821.392425] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._schedule( [ 821.392425] nova-conductor[52553]: ERROR nova.conductor.manager [ 821.392425] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 821.392425] nova-conductor[52553]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 821.392425] nova-conductor[52553]: ERROR nova.conductor.manager [ 821.392425] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 821.392425] nova-conductor[52553]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 821.392425] nova-conductor[52553]: ERROR nova.conductor.manager [ 821.392425] nova-conductor[52553]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
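[editor's note] The conductor-side traceback shows the exception arriving over RPC (cctxt.call -> amqpdriver -> raise result) as nova.exception_Remote.NoValidHost_Remote rather than nova.exception.NoValidHost: oslo.messaging rebuilds server-side exceptions on the client with a "_Remote" suffix on the class and module names. The snippet below only mimics that renaming idea locally; it is not oslo.messaging's actual implementation.

def make_remote_class(exc_type):
    # Build a subclass whose name and module advertise the RPC boundary it crossed.
    return type(exc_type.__name__ + '_Remote',
                (exc_type,),
                {'__module__': exc_type.__module__ + '_Remote'})

class NoValidHost(Exception):
    pass

RemoteNoValidHost = make_remote_class(NoValidHost)
err = RemoteNoValidHost('No valid host was found. There are not enough hosts available.')
print(type(err).__module__, type(err).__name__)  # e.g. __main___Remote NoValidHost_Remote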
[ 821.392425] nova-conductor[52553]: ERROR nova.conductor.manager [ 821.392425] nova-conductor[52553]: ERROR nova.conductor.manager [ 821.403142] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-64ea3b7d-3e5d-4e9f-89ef-aac8079f485d tempest-ServersNegativeTestMultiTenantJSON-1475797793 tempest-ServersNegativeTestMultiTenantJSON-1475797793-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 821.403573] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-64ea3b7d-3e5d-4e9f-89ef-aac8079f485d tempest-ServersNegativeTestMultiTenantJSON-1475797793 tempest-ServersNegativeTestMultiTenantJSON-1475797793-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 821.403802] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-64ea3b7d-3e5d-4e9f-89ef-aac8079f485d tempest-ServersNegativeTestMultiTenantJSON-1475797793 tempest-ServersNegativeTestMultiTenantJSON-1475797793-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 821.453724] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-64ea3b7d-3e5d-4e9f-89ef-aac8079f485d tempest-ServersNegativeTestMultiTenantJSON-1475797793 tempest-ServersNegativeTestMultiTenantJSON-1475797793-project-member] [instance: 835055fb-1b63-4895-8676-e52e617393ec] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 821.454988] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-64ea3b7d-3e5d-4e9f-89ef-aac8079f485d tempest-ServersNegativeTestMultiTenantJSON-1475797793 tempest-ServersNegativeTestMultiTenantJSON-1475797793-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 821.455547] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-64ea3b7d-3e5d-4e9f-89ef-aac8079f485d tempest-ServersNegativeTestMultiTenantJSON-1475797793 tempest-ServersNegativeTestMultiTenantJSON-1475797793-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 821.455778] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None 
req-64ea3b7d-3e5d-4e9f-89ef-aac8079f485d tempest-ServersNegativeTestMultiTenantJSON-1475797793 tempest-ServersNegativeTestMultiTenantJSON-1475797793-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 821.459656] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-64ea3b7d-3e5d-4e9f-89ef-aac8079f485d tempest-ServersNegativeTestMultiTenantJSON-1475797793 tempest-ServersNegativeTestMultiTenantJSON-1475797793-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 821.459656] nova-conductor[52553]: Traceback (most recent call last): [ 821.459656] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 821.459656] nova-conductor[52553]: return func(*args, **kwargs) [ 821.459656] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 821.459656] nova-conductor[52553]: selections = self._select_destinations( [ 821.459656] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 821.459656] nova-conductor[52553]: selections = self._schedule( [ 821.459656] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 821.459656] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 821.459656] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 821.459656] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 821.459656] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 821.459656] nova-conductor[52553]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 821.460357] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-64ea3b7d-3e5d-4e9f-89ef-aac8079f485d tempest-ServersNegativeTestMultiTenantJSON-1475797793 tempest-ServersNegativeTestMultiTenantJSON-1475797793-project-member] [instance: 835055fb-1b63-4895-8676-e52e617393ec] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 822.421016] nova-conductor[52554]: ERROR nova.conductor.manager [None req-7a2b4e44-d964-43dd-9d9b-2dcdbe58fba9 tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 822.421016] nova-conductor[52554]: Traceback (most recent call last): [ 822.421016] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 822.421016] nova-conductor[52554]: return func(*args, **kwargs) [ 822.421016] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 822.421016] nova-conductor[52554]: selections = self._select_destinations( [ 822.421016] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 822.421016] nova-conductor[52554]: selections = self._schedule( [ 822.421016] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 822.421016] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 822.421016] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 822.421016] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 822.421016] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 822.421016] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 822.421016] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 822.421016] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 822.421016] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 822.421016] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 822.421016] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 822.421016] nova-conductor[52554]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 822.421016] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 822.421016] nova-conductor[52554]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 822.421016] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 822.421016] nova-conductor[52554]: ERROR nova.conductor.manager result = self.transport._send( [ 822.421016] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 822.421016] nova-conductor[52554]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 822.421016] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 822.421016] nova-conductor[52554]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 822.421016] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 822.421016] nova-conductor[52554]: ERROR nova.conductor.manager raise result [ 822.421016] nova-conductor[52554]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 822.421016] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 822.421016] nova-conductor[52554]: ERROR nova.conductor.manager [ 822.421016] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 822.421016] nova-conductor[52554]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 822.421016] nova-conductor[52554]: ERROR nova.conductor.manager [ 822.421016] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 822.421016] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._select_destinations( [ 822.421016] nova-conductor[52554]: ERROR nova.conductor.manager [ 822.421016] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 822.421016] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._schedule( [ 822.421016] nova-conductor[52554]: ERROR nova.conductor.manager [ 822.421016] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 822.421016] nova-conductor[52554]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 822.421016] nova-conductor[52554]: ERROR nova.conductor.manager [ 822.421016] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 822.421016] nova-conductor[52554]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 822.421016] nova-conductor[52554]: ERROR nova.conductor.manager [ 822.421016] nova-conductor[52554]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
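[editor's note] The surrounding DEBUG "Acquiring lock / acquired / released" records come from the lockutils-wrapped inner function in nova.context.set_target_cell: each cell UUID is targeted under a per-UUID synchronized lock so cached DB/MQ connections are built only once. The oslo_concurrency.lockutils.synchronized decorator is real; the cache and function body below are an illustrative sketch, not Nova's code.

from oslo_concurrency import lockutils

CELL0_UUID = '00000000-0000-0000-0000-000000000000'
_cell_cache = {}

def get_or_set_cached_cell(cell_uuid, build_connections):
    @lockutils.synchronized(cell_uuid)
    def inner():
        # Only the first caller per cell builds connections; later callers reuse them.
        if cell_uuid not in _cell_cache:
            _cell_cache[cell_uuid] = build_connections(cell_uuid)
        return _cell_cache[cell_uuid]
    return inner()

# Example:
# get_or_set_cached_cell(CELL0_UUID, lambda uuid: {'db': 'connection-for-' + uuid})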
[ 822.421016] nova-conductor[52554]: ERROR nova.conductor.manager [ 822.421016] nova-conductor[52554]: ERROR nova.conductor.manager [ 822.437533] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-7a2b4e44-d964-43dd-9d9b-2dcdbe58fba9 tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 822.437797] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-7a2b4e44-d964-43dd-9d9b-2dcdbe58fba9 tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 822.437975] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-7a2b4e44-d964-43dd-9d9b-2dcdbe58fba9 tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 822.496273] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-7a2b4e44-d964-43dd-9d9b-2dcdbe58fba9 tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: 29d7d52b-c973-416c-8e57-b85a71542a47] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 822.497118] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-7a2b4e44-d964-43dd-9d9b-2dcdbe58fba9 tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 822.497458] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-7a2b4e44-d964-43dd-9d9b-2dcdbe58fba9 tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 822.497657] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-7a2b4e44-d964-43dd-9d9b-2dcdbe58fba9 tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" 
:: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 822.501657] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-7a2b4e44-d964-43dd-9d9b-2dcdbe58fba9 tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 822.501657] nova-conductor[52554]: Traceback (most recent call last): [ 822.501657] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 822.501657] nova-conductor[52554]: return func(*args, **kwargs) [ 822.501657] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 822.501657] nova-conductor[52554]: selections = self._select_destinations( [ 822.501657] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 822.501657] nova-conductor[52554]: selections = self._schedule( [ 822.501657] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 822.501657] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 822.501657] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 822.501657] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 822.501657] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 822.501657] nova-conductor[52554]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 822.502821] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-7a2b4e44-d964-43dd-9d9b-2dcdbe58fba9 tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: 29d7d52b-c973-416c-8e57-b85a71542a47] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 823.562864] nova-conductor[52553]: ERROR nova.conductor.manager [None req-c8f7c8da-b50e-476e-beea-219712b4f37e tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 823.562864] nova-conductor[52553]: Traceback (most recent call last): [ 823.562864] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 823.562864] nova-conductor[52553]: return func(*args, **kwargs) [ 823.562864] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 823.562864] nova-conductor[52553]: selections = self._select_destinations( [ 823.562864] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 823.562864] nova-conductor[52553]: selections = self._schedule( [ 823.562864] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 823.562864] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 823.562864] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 823.562864] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 823.562864] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 823.562864] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 823.562864] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 823.562864] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 823.562864] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 823.562864] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 823.562864] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 823.562864] nova-conductor[52553]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 823.562864] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 823.562864] nova-conductor[52553]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 823.562864] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 823.562864] nova-conductor[52553]: ERROR nova.conductor.manager result = self.transport._send( [ 823.562864] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 823.562864] nova-conductor[52553]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 823.562864] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 823.562864] nova-conductor[52553]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 823.562864] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 823.562864] nova-conductor[52553]: ERROR nova.conductor.manager raise result [ 823.562864] nova-conductor[52553]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 823.562864] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 823.562864] nova-conductor[52553]: ERROR nova.conductor.manager [ 823.562864] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 823.562864] nova-conductor[52553]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 823.562864] nova-conductor[52553]: ERROR nova.conductor.manager [ 823.562864] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 823.562864] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._select_destinations( [ 823.562864] nova-conductor[52553]: ERROR nova.conductor.manager [ 823.562864] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 823.562864] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._schedule( [ 823.562864] nova-conductor[52553]: ERROR nova.conductor.manager [ 823.562864] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 823.562864] nova-conductor[52553]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 823.562864] nova-conductor[52553]: ERROR nova.conductor.manager [ 823.562864] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 823.562864] nova-conductor[52553]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 823.562864] nova-conductor[52553]: ERROR nova.conductor.manager [ 823.562864] nova-conductor[52553]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
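[editor's note] The block_device_mapping DEBUG records in this run all describe the same thing: a single image-backed local root disk (source_type='image', destination_type='local', boot_index=0, delete_on_termination=True). For reference, a request-side equivalent expressed as a block_device_mapping_v2 entry would look roughly like the dict below; the image UUID is the image_id from the log, and the exact accepted fields should be checked against the Nova API reference.

root_bdm = {
    'boot_index': 0,
    'uuid': '856e89ba-b7a4-4a81-ad9d-2997fe327c0c',  # image_id in the log record
    'source_type': 'image',
    'destination_type': 'local',
    'device_type': 'disk',
    'delete_on_termination': True,
}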
[ 823.562864] nova-conductor[52553]: ERROR nova.conductor.manager [ 823.562864] nova-conductor[52553]: ERROR nova.conductor.manager [ 823.569518] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-c8f7c8da-b50e-476e-beea-219712b4f37e tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 823.569891] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-c8f7c8da-b50e-476e-beea-219712b4f37e tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 823.569989] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-c8f7c8da-b50e-476e-beea-219712b4f37e tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 823.618449] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-c8f7c8da-b50e-476e-beea-219712b4f37e tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] [instance: ffdff847-fef6-453f-b652-57c397067b5e] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 823.619278] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-c8f7c8da-b50e-476e-beea-219712b4f37e tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 823.619370] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-c8f7c8da-b50e-476e-beea-219712b4f37e tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 823.619518] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-c8f7c8da-b50e-476e-beea-219712b4f37e tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 823.622616] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-c8f7c8da-b50e-476e-beea-219712b4f37e tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 823.622616] nova-conductor[52553]: Traceback (most recent call last): [ 823.622616] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 823.622616] nova-conductor[52553]: return func(*args, **kwargs) [ 823.622616] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 823.622616] nova-conductor[52553]: selections = self._select_destinations( [ 823.622616] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 823.622616] nova-conductor[52553]: selections = self._schedule( [ 823.622616] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 823.622616] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 823.622616] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 823.622616] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 823.622616] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 823.622616] nova-conductor[52553]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 823.623253] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-c8f7c8da-b50e-476e-beea-219712b4f37e tempest-ServerDiskConfigTestJSON-667820123 tempest-ServerDiskConfigTestJSON-667820123-project-member] [instance: ffdff847-fef6-453f-b652-57c397067b5e] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 825.096855] nova-conductor[52554]: ERROR nova.conductor.manager [None req-446dfe63-27ac-4e77-96ec-ad472ce7b54e tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 825.096855] nova-conductor[52554]: Traceback (most recent call last): [ 825.096855] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 825.096855] nova-conductor[52554]: return func(*args, **kwargs) [ 825.096855] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 825.096855] nova-conductor[52554]: selections = self._select_destinations( [ 825.096855] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 825.096855] nova-conductor[52554]: selections = self._schedule( [ 825.096855] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 825.096855] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 825.096855] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 825.096855] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 825.096855] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 825.096855] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 825.096855] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 825.096855] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 825.096855] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 825.096855] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 825.096855] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 825.096855] nova-conductor[52554]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 825.096855] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 825.096855] nova-conductor[52554]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 825.096855] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 825.096855] nova-conductor[52554]: ERROR nova.conductor.manager result = self.transport._send( [ 825.096855] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 825.096855] nova-conductor[52554]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 825.096855] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 825.096855] nova-conductor[52554]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 825.096855] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 825.096855] nova-conductor[52554]: ERROR nova.conductor.manager raise result [ 825.096855] nova-conductor[52554]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 825.096855] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 825.096855] nova-conductor[52554]: ERROR nova.conductor.manager [ 825.096855] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 825.096855] nova-conductor[52554]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 825.096855] nova-conductor[52554]: ERROR nova.conductor.manager [ 825.096855] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 825.096855] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._select_destinations( [ 825.096855] nova-conductor[52554]: ERROR nova.conductor.manager [ 825.096855] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 825.096855] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._schedule( [ 825.096855] nova-conductor[52554]: ERROR nova.conductor.manager [ 825.096855] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 825.096855] nova-conductor[52554]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 825.096855] nova-conductor[52554]: ERROR nova.conductor.manager [ 825.096855] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 825.096855] nova-conductor[52554]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 825.096855] nova-conductor[52554]: ERROR nova.conductor.manager [ 825.096855] nova-conductor[52554]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 825.096855] nova-conductor[52554]: ERROR nova.conductor.manager [ 825.096855] nova-conductor[52554]: ERROR nova.conductor.manager [ 825.104947] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-446dfe63-27ac-4e77-96ec-ad472ce7b54e tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 825.105197] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-446dfe63-27ac-4e77-96ec-ad472ce7b54e tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 825.105365] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-446dfe63-27ac-4e77-96ec-ad472ce7b54e tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 825.150660] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-446dfe63-27ac-4e77-96ec-ad472ce7b54e tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: 326e6624-2db7-4a63-b075-d24c4dd74730] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 825.151761] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-446dfe63-27ac-4e77-96ec-ad472ce7b54e tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 825.151980] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-446dfe63-27ac-4e77-96ec-ad472ce7b54e tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 825.152188] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-446dfe63-27ac-4e77-96ec-ad472ce7b54e tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" 
:: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 825.159101] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-446dfe63-27ac-4e77-96ec-ad472ce7b54e tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 825.159101] nova-conductor[52554]: Traceback (most recent call last): [ 825.159101] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 825.159101] nova-conductor[52554]: return func(*args, **kwargs) [ 825.159101] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 825.159101] nova-conductor[52554]: selections = self._select_destinations( [ 825.159101] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 825.159101] nova-conductor[52554]: selections = self._schedule( [ 825.159101] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 825.159101] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 825.159101] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 825.159101] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 825.159101] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 825.159101] nova-conductor[52554]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 825.161291] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-446dfe63-27ac-4e77-96ec-ad472ce7b54e tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] [instance: 326e6624-2db7-4a63-b075-d24c4dd74730] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 833.339132] nova-conductor[52554]: ERROR nova.conductor.manager [None req-1c697637-880a-40a9-9f00-0593fd04f088 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 833.339132] nova-conductor[52554]: Traceback (most recent call last): [ 833.339132] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 833.339132] nova-conductor[52554]: return func(*args, **kwargs) [ 833.339132] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 833.339132] nova-conductor[52554]: selections = self._select_destinations( [ 833.339132] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 833.339132] nova-conductor[52554]: selections = self._schedule( [ 833.339132] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 833.339132] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 833.339132] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 833.339132] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 833.339132] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 833.339132] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 833.339132] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 833.339132] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 833.339132] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 833.339132] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 833.339132] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 833.339132] nova-conductor[52554]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 833.339132] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 833.339132] nova-conductor[52554]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 833.339132] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 833.339132] nova-conductor[52554]: ERROR nova.conductor.manager result = self.transport._send( [ 833.339132] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 833.339132] nova-conductor[52554]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 833.339132] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 833.339132] nova-conductor[52554]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 833.339132] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 833.339132] nova-conductor[52554]: ERROR nova.conductor.manager raise result [ 833.339132] nova-conductor[52554]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 833.339132] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 833.339132] nova-conductor[52554]: ERROR nova.conductor.manager [ 833.339132] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 833.339132] nova-conductor[52554]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 833.339132] nova-conductor[52554]: ERROR nova.conductor.manager [ 833.339132] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 833.339132] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._select_destinations( [ 833.339132] nova-conductor[52554]: ERROR nova.conductor.manager [ 833.339132] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 833.339132] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._schedule( [ 833.339132] nova-conductor[52554]: ERROR nova.conductor.manager [ 833.339132] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 833.339132] nova-conductor[52554]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 833.339132] nova-conductor[52554]: ERROR nova.conductor.manager [ 833.339132] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 833.339132] nova-conductor[52554]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 833.339132] nova-conductor[52554]: ERROR nova.conductor.manager [ 833.339132] nova-conductor[52554]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
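[editor's note] The WARNING pair that follows each failed schedule ("Failed to compute_task_build_instances" then "Setting instance to ERROR state.") is the conductor giving up on the build: the scheduler RPC raised NoValidHost, so the instance is pushed to ERROR instead of staying in BUILD. A hedged sketch of that handling pattern; names and signatures below are illustrative, not nova.conductor's actual code.

class NoValidHost(Exception):
    pass

def build_instances(select_destinations, instance, log):
    try:
        return select_destinations()
    except NoValidHost as exc:
        # Mirrors the WARNING pair in the log: report the scheduling failure, then
        # mark the instance so the API shows ERROR rather than a stuck BUILD state.
        log('Failed to compute_task_build_instances: %s' % exc)
        log('[instance: %s] Setting instance to ERROR state.' % instance['uuid'])
        instance['vm_state'] = 'error'
        return None

def failing_select_destinations():
    raise NoValidHost('No valid host was found. There are not enough hosts available.')

build_instances(failing_select_destinations,
                {'uuid': 'f0fc3472-42e8-42a3-b287-e0de87299c5b'}, print)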
[ 833.339132] nova-conductor[52554]: ERROR nova.conductor.manager [ 833.339132] nova-conductor[52554]: ERROR nova.conductor.manager [ 833.345379] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-1c697637-880a-40a9-9f00-0593fd04f088 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 833.345610] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-1c697637-880a-40a9-9f00-0593fd04f088 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 833.345787] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-1c697637-880a-40a9-9f00-0593fd04f088 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 833.386642] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-1c697637-880a-40a9-9f00-0593fd04f088 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] [instance: f0fc3472-42e8-42a3-b287-e0de87299c5b] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 833.387297] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-1c697637-880a-40a9-9f00-0593fd04f088 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 833.387506] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-1c697637-880a-40a9-9f00-0593fd04f088 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 833.387678] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-1c697637-880a-40a9-9f00-0593fd04f088 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 833.390426] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-1c697637-880a-40a9-9f00-0593fd04f088 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 833.390426] nova-conductor[52554]: Traceback (most recent call last): [ 833.390426] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 833.390426] nova-conductor[52554]: return func(*args, **kwargs) [ 833.390426] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 833.390426] nova-conductor[52554]: selections = self._select_destinations( [ 833.390426] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 833.390426] nova-conductor[52554]: selections = self._schedule( [ 833.390426] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 833.390426] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 833.390426] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 833.390426] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 833.390426] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 833.390426] nova-conductor[52554]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 833.390928] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-1c697637-880a-40a9-9f00-0593fd04f088 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] [instance: f0fc3472-42e8-42a3-b287-e0de87299c5b] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 835.990760] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Took 0.12 seconds to select destinations for 1 instance(s). 
{{(pid=52554) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 836.020176] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 836.020413] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 836.020584] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 836.059798] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 836.059981] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 836.060181] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 836.060530] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 836.060715] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 836.060878] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 836.069225] nova-conductor[52554]: DEBUG nova.quota [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Getting quotas for project 581c2db844984c00bc0bad0475272109. Resources: {'cores', 'instances', 'ram'} {{(pid=52554) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 836.071475] nova-conductor[52554]: DEBUG nova.quota [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Getting quotas for user 618b89e8d8134c66b8662bdf4ca06d5c and project 581c2db844984c00bc0bad0475272109. Resources: {'cores', 'instances', 'ram'} {{(pid=52554) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 836.076787] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52554) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 836.077252] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 836.077452] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 836.077618] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 836.080303] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] [instance: f3566a4b-8fe0-4c85-9c45-7c67cfd30323] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 836.080906] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 836.081115] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 836.081284] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 836.092749] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 836.092948] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 836.093134] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 860.716888] nova-conductor[52554]: ERROR nova.scheduler.utils [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: 
ce718fc3-6f75-49b9-8543-c953646ce0d9] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance ce718fc3-6f75-49b9-8543-c953646ce0d9 was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 860.717548] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Rescheduling: True {{(pid=52554) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 860.717788] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance ce718fc3-6f75-49b9-8543-c953646ce0d9.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance ce718fc3-6f75-49b9-8543-c953646ce0d9. 
[ 860.718122] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance ce718fc3-6f75-49b9-8543-c953646ce0d9. [ 860.740903] nova-conductor[52554]: DEBUG nova.network.neutron [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] deallocate_for_instance() {{(pid=52554) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 860.757980] nova-conductor[52554]: DEBUG nova.network.neutron [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Instance cache missing network info. {{(pid=52554) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 860.761664] nova-conductor[52554]: DEBUG nova.network.neutron [None req-10258a28-7eca-45bc-b693-3fc602fe0a6a tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: ce718fc3-6f75-49b9-8543-c953646ce0d9] Updating instance_info_cache with network_info: [] {{(pid=52554) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 863.215020] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Took 0.10 seconds to select destinations for 1 instance(s). 
{{(pid=52554) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 863.226980] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 863.227239] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 863.227440] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 863.253157] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 863.253404] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 863.253608] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 863.254038] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 863.254248] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" 
acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 863.254413] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 863.262670] nova-conductor[52554]: DEBUG nova.quota [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Getting quotas for project eb70c075cb2e4c44917d5ba6cb849786. Resources: {'cores', 'instances', 'ram'} {{(pid=52554) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 863.264939] nova-conductor[52554]: DEBUG nova.quota [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Getting quotas for user b786da2369eb45ab916b9e137d644dc8 and project eb70c075cb2e4c44917d5ba6cb849786. Resources: {'cores', 'instances', 'ram'} {{(pid=52554) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 863.270163] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52554) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 863.270555] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 863.270751] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 863.270917] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 863.275252] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: c0f7ff03-5203-418d-aa9e-420448e9dbfb] 
block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 863.275837] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 863.276044] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 863.276219] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 863.287201] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 863.287392] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 863.287560] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 908.990453] nova-conductor[52553]: ERROR nova.scheduler.utils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 
tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance ae25fbd0-3770-43fc-9850-cdb2065b5ce3 was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 908.991155] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Rescheduling: True {{(pid=52553) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 908.991442] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance ae25fbd0-3770-43fc-9850-cdb2065b5ce3.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance ae25fbd0-3770-43fc-9850-cdb2065b5ce3. 
[ 908.991709] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-0b0b8a8c-0d6c-41eb-8956-7b62ea0886f7 tempest-ServersAdmin275Test-1044309481 tempest-ServersAdmin275Test-1044309481-project-member] [instance: ae25fbd0-3770-43fc-9850-cdb2065b5ce3] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance ae25fbd0-3770-43fc-9850-cdb2065b5ce3. [ 915.325212] nova-conductor[52553]: ERROR nova.conductor.manager [None req-c5dd0f67-52fe-405e-a120-56679c3a1218 tempest-ServerRescueNegativeTestJSON-515251200 tempest-ServerRescueNegativeTestJSON-515251200-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 915.325212] nova-conductor[52553]: Traceback (most recent call last): [ 915.325212] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 915.325212] nova-conductor[52553]: return func(*args, **kwargs) [ 915.325212] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 915.325212] nova-conductor[52553]: selections = self._select_destinations( [ 915.325212] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 915.325212] nova-conductor[52553]: selections = self._schedule( [ 915.325212] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 915.325212] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 915.325212] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 915.325212] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 915.325212] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 915.325212] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 915.325212] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 915.325212] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 915.325212] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 915.325212] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 915.325212] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 915.325212] nova-conductor[52553]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 915.325212] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 915.325212] nova-conductor[52553]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 915.325212] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 915.325212] nova-conductor[52553]: ERROR nova.conductor.manager result = self.transport._send( [ 915.325212] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 915.325212] nova-conductor[52553]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 915.325212] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 915.325212] nova-conductor[52553]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 915.325212] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 915.325212] nova-conductor[52553]: ERROR nova.conductor.manager raise result [ 915.325212] nova-conductor[52553]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 915.325212] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 915.325212] nova-conductor[52553]: ERROR nova.conductor.manager [ 915.325212] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 915.325212] nova-conductor[52553]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 915.325212] nova-conductor[52553]: ERROR nova.conductor.manager [ 915.325212] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 915.325212] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._select_destinations( [ 915.325212] nova-conductor[52553]: ERROR nova.conductor.manager [ 915.325212] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 915.325212] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._schedule( [ 915.325212] nova-conductor[52553]: ERROR nova.conductor.manager [ 915.325212] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 915.325212] nova-conductor[52553]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 915.325212] nova-conductor[52553]: ERROR nova.conductor.manager [ 915.325212] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 915.325212] nova-conductor[52553]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 915.325212] nova-conductor[52553]: ERROR nova.conductor.manager [ 915.325212] nova-conductor[52553]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 915.325212] nova-conductor[52553]: ERROR nova.conductor.manager [ 915.325212] nova-conductor[52553]: ERROR nova.conductor.manager [ 915.334993] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-c5dd0f67-52fe-405e-a120-56679c3a1218 tempest-ServerRescueNegativeTestJSON-515251200 tempest-ServerRescueNegativeTestJSON-515251200-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 915.334993] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-c5dd0f67-52fe-405e-a120-56679c3a1218 tempest-ServerRescueNegativeTestJSON-515251200 tempest-ServerRescueNegativeTestJSON-515251200-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 915.334993] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-c5dd0f67-52fe-405e-a120-56679c3a1218 tempest-ServerRescueNegativeTestJSON-515251200 tempest-ServerRescueNegativeTestJSON-515251200-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 915.392694] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-c5dd0f67-52fe-405e-a120-56679c3a1218 tempest-ServerRescueNegativeTestJSON-515251200 tempest-ServerRescueNegativeTestJSON-515251200-project-member] [instance: 975b5459-fdb6-4194-85de-1b0c0115d53b] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 915.393543] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-c5dd0f67-52fe-405e-a120-56679c3a1218 tempest-ServerRescueNegativeTestJSON-515251200 tempest-ServerRescueNegativeTestJSON-515251200-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 915.393764] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-c5dd0f67-52fe-405e-a120-56679c3a1218 tempest-ServerRescueNegativeTestJSON-515251200 tempest-ServerRescueNegativeTestJSON-515251200-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 915.393941] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-c5dd0f67-52fe-405e-a120-56679c3a1218 tempest-ServerRescueNegativeTestJSON-515251200 
tempest-ServerRescueNegativeTestJSON-515251200-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 915.398015] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-c5dd0f67-52fe-405e-a120-56679c3a1218 tempest-ServerRescueNegativeTestJSON-515251200 tempest-ServerRescueNegativeTestJSON-515251200-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 915.398015] nova-conductor[52553]: Traceback (most recent call last): [ 915.398015] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 915.398015] nova-conductor[52553]: return func(*args, **kwargs) [ 915.398015] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 915.398015] nova-conductor[52553]: selections = self._select_destinations( [ 915.398015] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 915.398015] nova-conductor[52553]: selections = self._schedule( [ 915.398015] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 915.398015] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 915.398015] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 915.398015] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 915.398015] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 915.398015] nova-conductor[52553]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 915.398574] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-c5dd0f67-52fe-405e-a120-56679c3a1218 tempest-ServerRescueNegativeTestJSON-515251200 tempest-ServerRescueNegativeTestJSON-515251200-project-member] [instance: 975b5459-fdb6-4194-85de-1b0c0115d53b] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 915.890020] nova-conductor[52554]: ERROR nova.conductor.manager [None req-9778ba32-7af2-4eac-8717-07db5a69dbb2 tempest-ServerRescueNegativeTestJSON-515251200 tempest-ServerRescueNegativeTestJSON-515251200-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 915.890020] nova-conductor[52554]: Traceback (most recent call last): [ 915.890020] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 915.890020] nova-conductor[52554]: return func(*args, **kwargs) [ 915.890020] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 915.890020] nova-conductor[52554]: selections = self._select_destinations( [ 915.890020] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 915.890020] nova-conductor[52554]: selections = self._schedule( [ 915.890020] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 915.890020] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 915.890020] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 915.890020] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 915.890020] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 915.890020] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 915.890020] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 915.890020] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 915.890020] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 915.890020] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 915.890020] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 915.890020] nova-conductor[52554]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 915.890020] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 915.890020] nova-conductor[52554]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 915.890020] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 915.890020] nova-conductor[52554]: ERROR nova.conductor.manager result = self.transport._send( [ 915.890020] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 915.890020] nova-conductor[52554]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 915.890020] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 915.890020] nova-conductor[52554]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 915.890020] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 915.890020] nova-conductor[52554]: ERROR nova.conductor.manager raise result [ 915.890020] nova-conductor[52554]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 915.890020] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 915.890020] nova-conductor[52554]: ERROR nova.conductor.manager [ 915.890020] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 915.890020] nova-conductor[52554]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 915.890020] nova-conductor[52554]: ERROR nova.conductor.manager [ 915.890020] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 915.890020] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._select_destinations( [ 915.890020] nova-conductor[52554]: ERROR nova.conductor.manager [ 915.890020] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 915.890020] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._schedule( [ 915.890020] nova-conductor[52554]: ERROR nova.conductor.manager [ 915.890020] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 915.890020] nova-conductor[52554]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 915.890020] nova-conductor[52554]: ERROR nova.conductor.manager [ 915.890020] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 915.890020] nova-conductor[52554]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 915.890020] nova-conductor[52554]: ERROR nova.conductor.manager [ 915.890020] nova-conductor[52554]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 915.890020] nova-conductor[52554]: ERROR nova.conductor.manager [ 915.890020] nova-conductor[52554]: ERROR nova.conductor.manager [ 915.898706] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-9778ba32-7af2-4eac-8717-07db5a69dbb2 tempest-ServerRescueNegativeTestJSON-515251200 tempest-ServerRescueNegativeTestJSON-515251200-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 915.899094] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-9778ba32-7af2-4eac-8717-07db5a69dbb2 tempest-ServerRescueNegativeTestJSON-515251200 tempest-ServerRescueNegativeTestJSON-515251200-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 915.900777] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-9778ba32-7af2-4eac-8717-07db5a69dbb2 tempest-ServerRescueNegativeTestJSON-515251200 tempest-ServerRescueNegativeTestJSON-515251200-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 915.968416] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-9778ba32-7af2-4eac-8717-07db5a69dbb2 tempest-ServerRescueNegativeTestJSON-515251200 tempest-ServerRescueNegativeTestJSON-515251200-project-member] [instance: 62d8d399-e0e3-4557-af64-0caf823c6906] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 915.969299] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-9778ba32-7af2-4eac-8717-07db5a69dbb2 tempest-ServerRescueNegativeTestJSON-515251200 tempest-ServerRescueNegativeTestJSON-515251200-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 915.969877] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-9778ba32-7af2-4eac-8717-07db5a69dbb2 tempest-ServerRescueNegativeTestJSON-515251200 tempest-ServerRescueNegativeTestJSON-515251200-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 915.969877] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-9778ba32-7af2-4eac-8717-07db5a69dbb2 tempest-ServerRescueNegativeTestJSON-515251200 
tempest-ServerRescueNegativeTestJSON-515251200-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 915.974054] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-9778ba32-7af2-4eac-8717-07db5a69dbb2 tempest-ServerRescueNegativeTestJSON-515251200 tempest-ServerRescueNegativeTestJSON-515251200-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 915.974054] nova-conductor[52554]: Traceback (most recent call last): [ 915.974054] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 915.974054] nova-conductor[52554]: return func(*args, **kwargs) [ 915.974054] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 915.974054] nova-conductor[52554]: selections = self._select_destinations( [ 915.974054] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 915.974054] nova-conductor[52554]: selections = self._schedule( [ 915.974054] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 915.974054] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 915.974054] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 915.974054] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 915.974054] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 915.974054] nova-conductor[52554]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 915.974054] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-9778ba32-7af2-4eac-8717-07db5a69dbb2 tempest-ServerRescueNegativeTestJSON-515251200 tempest-ServerRescueNegativeTestJSON-515251200-project-member] [instance: 62d8d399-e0e3-4557-af64-0caf823c6906] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 917.044020] nova-conductor[52553]: ERROR nova.conductor.manager [None req-06468cd9-4ab8-4742-b718-8768089eb035 tempest-FloatingIPsAssociationNegativeTestJSON-1684262862 tempest-FloatingIPsAssociationNegativeTestJSON-1684262862-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 917.044020] nova-conductor[52553]: Traceback (most recent call last): [ 917.044020] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 917.044020] nova-conductor[52553]: return func(*args, **kwargs) [ 917.044020] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 917.044020] nova-conductor[52553]: selections = self._select_destinations( [ 917.044020] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 917.044020] nova-conductor[52553]: selections = self._schedule( [ 917.044020] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 917.044020] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 917.044020] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 917.044020] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 917.044020] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 917.044020] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 917.044020] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 917.044020] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 917.044020] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 917.044020] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 917.044020] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 917.044020] nova-conductor[52553]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 917.044020] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 917.044020] nova-conductor[52553]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 917.044020] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 917.044020] nova-conductor[52553]: ERROR nova.conductor.manager result = self.transport._send( [ 917.044020] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 917.044020] nova-conductor[52553]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 917.044020] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 917.044020] nova-conductor[52553]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 917.044020] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 917.044020] nova-conductor[52553]: ERROR nova.conductor.manager raise result [ 917.044020] nova-conductor[52553]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 917.044020] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 917.044020] nova-conductor[52553]: ERROR nova.conductor.manager [ 917.044020] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 917.044020] nova-conductor[52553]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 917.044020] nova-conductor[52553]: ERROR nova.conductor.manager [ 917.044020] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 917.044020] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._select_destinations( [ 917.044020] nova-conductor[52553]: ERROR nova.conductor.manager [ 917.044020] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 917.044020] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._schedule( [ 917.044020] nova-conductor[52553]: ERROR nova.conductor.manager [ 917.044020] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 917.044020] nova-conductor[52553]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 917.044020] nova-conductor[52553]: ERROR nova.conductor.manager [ 917.044020] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 917.044020] nova-conductor[52553]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 917.044020] nova-conductor[52553]: ERROR nova.conductor.manager [ 917.044020] nova-conductor[52553]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 917.044020] nova-conductor[52553]: ERROR nova.conductor.manager [ 917.044020] nova-conductor[52553]: ERROR nova.conductor.manager [ 917.059048] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-06468cd9-4ab8-4742-b718-8768089eb035 tempest-FloatingIPsAssociationNegativeTestJSON-1684262862 tempest-FloatingIPsAssociationNegativeTestJSON-1684262862-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 917.059048] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-06468cd9-4ab8-4742-b718-8768089eb035 tempest-FloatingIPsAssociationNegativeTestJSON-1684262862 tempest-FloatingIPsAssociationNegativeTestJSON-1684262862-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 917.059048] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-06468cd9-4ab8-4742-b718-8768089eb035 tempest-FloatingIPsAssociationNegativeTestJSON-1684262862 tempest-FloatingIPsAssociationNegativeTestJSON-1684262862-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 917.146817] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-06468cd9-4ab8-4742-b718-8768089eb035 tempest-FloatingIPsAssociationNegativeTestJSON-1684262862 tempest-FloatingIPsAssociationNegativeTestJSON-1684262862-project-member] [instance: 15b9ac1a-5d95-41e9-a1c3-48884dcf16e5] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 917.146817] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-06468cd9-4ab8-4742-b718-8768089eb035 tempest-FloatingIPsAssociationNegativeTestJSON-1684262862 tempest-FloatingIPsAssociationNegativeTestJSON-1684262862-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 917.146921] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-06468cd9-4ab8-4742-b718-8768089eb035 tempest-FloatingIPsAssociationNegativeTestJSON-1684262862 tempest-FloatingIPsAssociationNegativeTestJSON-1684262862-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 917.147079] nova-conductor[52553]: DEBUG 
oslo_concurrency.lockutils [None req-06468cd9-4ab8-4742-b718-8768089eb035 tempest-FloatingIPsAssociationNegativeTestJSON-1684262862 tempest-FloatingIPsAssociationNegativeTestJSON-1684262862-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 917.153460] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-06468cd9-4ab8-4742-b718-8768089eb035 tempest-FloatingIPsAssociationNegativeTestJSON-1684262862 tempest-FloatingIPsAssociationNegativeTestJSON-1684262862-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 917.153460] nova-conductor[52553]: Traceback (most recent call last): [ 917.153460] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 917.153460] nova-conductor[52553]: return func(*args, **kwargs) [ 917.153460] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 917.153460] nova-conductor[52553]: selections = self._select_destinations( [ 917.153460] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 917.153460] nova-conductor[52553]: selections = self._schedule( [ 917.153460] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 917.153460] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 917.153460] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 917.153460] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 917.153460] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 917.153460] nova-conductor[52553]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 917.154039] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-06468cd9-4ab8-4742-b718-8768089eb035 tempest-FloatingIPsAssociationNegativeTestJSON-1684262862 tempest-FloatingIPsAssociationNegativeTestJSON-1684262862-project-member] [instance: 15b9ac1a-5d95-41e9-a1c3-48884dcf16e5] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 918.819372] nova-conductor[52554]: ERROR nova.conductor.manager [None req-9510b341-a70c-4f04-b91f-e358e35cccb1 tempest-InstanceActionsNegativeTestJSON-2005392144 tempest-InstanceActionsNegativeTestJSON-2005392144-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 918.819372] nova-conductor[52554]: Traceback (most recent call last): [ 918.819372] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 918.819372] nova-conductor[52554]: return func(*args, **kwargs) [ 918.819372] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 918.819372] nova-conductor[52554]: selections = self._select_destinations( [ 918.819372] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 918.819372] nova-conductor[52554]: selections = self._schedule( [ 918.819372] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 918.819372] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 918.819372] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 918.819372] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 918.819372] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 918.819372] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 918.819372] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 918.819372] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 918.819372] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 918.819372] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 918.819372] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 918.819372] nova-conductor[52554]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 918.819372] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 918.819372] nova-conductor[52554]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 918.819372] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 918.819372] nova-conductor[52554]: ERROR nova.conductor.manager result = self.transport._send( [ 918.819372] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 918.819372] nova-conductor[52554]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 918.819372] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 918.819372] nova-conductor[52554]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 918.819372] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 918.819372] nova-conductor[52554]: ERROR nova.conductor.manager raise result [ 918.819372] nova-conductor[52554]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 918.819372] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 918.819372] nova-conductor[52554]: ERROR nova.conductor.manager [ 918.819372] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 918.819372] nova-conductor[52554]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 918.819372] nova-conductor[52554]: ERROR nova.conductor.manager [ 918.819372] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 918.819372] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._select_destinations( [ 918.819372] nova-conductor[52554]: ERROR nova.conductor.manager [ 918.819372] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 918.819372] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._schedule( [ 918.819372] nova-conductor[52554]: ERROR nova.conductor.manager [ 918.819372] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 918.819372] nova-conductor[52554]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 918.819372] nova-conductor[52554]: ERROR nova.conductor.manager [ 918.819372] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 918.819372] nova-conductor[52554]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 918.819372] nova-conductor[52554]: ERROR nova.conductor.manager [ 918.819372] nova-conductor[52554]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 918.819372] nova-conductor[52554]: ERROR nova.conductor.manager [ 918.819372] nova-conductor[52554]: ERROR nova.conductor.manager [ 918.826171] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-9510b341-a70c-4f04-b91f-e358e35cccb1 tempest-InstanceActionsNegativeTestJSON-2005392144 tempest-InstanceActionsNegativeTestJSON-2005392144-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 918.826413] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-9510b341-a70c-4f04-b91f-e358e35cccb1 tempest-InstanceActionsNegativeTestJSON-2005392144 tempest-InstanceActionsNegativeTestJSON-2005392144-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 918.826589] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-9510b341-a70c-4f04-b91f-e358e35cccb1 tempest-InstanceActionsNegativeTestJSON-2005392144 tempest-InstanceActionsNegativeTestJSON-2005392144-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 918.871773] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-9510b341-a70c-4f04-b91f-e358e35cccb1 tempest-InstanceActionsNegativeTestJSON-2005392144 tempest-InstanceActionsNegativeTestJSON-2005392144-project-member] [instance: 9ba4a84b-2dbf-4322-971b-0d6cb5b21f07] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 918.873339] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-9510b341-a70c-4f04-b91f-e358e35cccb1 tempest-InstanceActionsNegativeTestJSON-2005392144 tempest-InstanceActionsNegativeTestJSON-2005392144-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 918.873599] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-9510b341-a70c-4f04-b91f-e358e35cccb1 tempest-InstanceActionsNegativeTestJSON-2005392144 tempest-InstanceActionsNegativeTestJSON-2005392144-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 918.873781] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-9510b341-a70c-4f04-b91f-e358e35cccb1 
tempest-InstanceActionsNegativeTestJSON-2005392144 tempest-InstanceActionsNegativeTestJSON-2005392144-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 918.876818] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-9510b341-a70c-4f04-b91f-e358e35cccb1 tempest-InstanceActionsNegativeTestJSON-2005392144 tempest-InstanceActionsNegativeTestJSON-2005392144-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 918.876818] nova-conductor[52554]: Traceback (most recent call last): [ 918.876818] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 918.876818] nova-conductor[52554]: return func(*args, **kwargs) [ 918.876818] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 918.876818] nova-conductor[52554]: selections = self._select_destinations( [ 918.876818] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 918.876818] nova-conductor[52554]: selections = self._schedule( [ 918.876818] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 918.876818] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 918.876818] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 918.876818] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 918.876818] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 918.876818] nova-conductor[52554]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 918.877357] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-9510b341-a70c-4f04-b91f-e358e35cccb1 tempest-InstanceActionsNegativeTestJSON-2005392144 tempest-InstanceActionsNegativeTestJSON-2005392144-project-member] [instance: 9ba4a84b-2dbf-4322-971b-0d6cb5b21f07] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 929.496741] nova-conductor[52553]: ERROR nova.conductor.manager [None req-7eebab9f-d037-4787-b6c0-e7b186fbf5e9 tempest-SecurityGroupsTestJSON-2074217135 tempest-SecurityGroupsTestJSON-2074217135-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 929.496741] nova-conductor[52553]: Traceback (most recent call last): [ 929.496741] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 929.496741] nova-conductor[52553]: return func(*args, **kwargs) [ 929.496741] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 929.496741] nova-conductor[52553]: selections = self._select_destinations( [ 929.496741] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 929.496741] nova-conductor[52553]: selections = self._schedule( [ 929.496741] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 929.496741] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 929.496741] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 929.496741] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 929.496741] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 929.496741] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 929.496741] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 929.496741] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 929.496741] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 929.496741] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 929.496741] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 929.496741] nova-conductor[52553]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 929.496741] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 929.496741] nova-conductor[52553]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 929.496741] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 929.496741] nova-conductor[52553]: ERROR nova.conductor.manager result = self.transport._send( [ 929.496741] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 929.496741] nova-conductor[52553]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 929.496741] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 929.496741] nova-conductor[52553]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 929.496741] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 929.496741] nova-conductor[52553]: ERROR nova.conductor.manager raise result [ 929.496741] nova-conductor[52553]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 929.496741] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 929.496741] nova-conductor[52553]: ERROR nova.conductor.manager [ 929.496741] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 929.496741] nova-conductor[52553]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 929.496741] nova-conductor[52553]: ERROR nova.conductor.manager [ 929.496741] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 929.496741] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._select_destinations( [ 929.496741] nova-conductor[52553]: ERROR nova.conductor.manager [ 929.496741] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 929.496741] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._schedule( [ 929.496741] nova-conductor[52553]: ERROR nova.conductor.manager [ 929.496741] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 929.496741] nova-conductor[52553]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 929.496741] nova-conductor[52553]: ERROR nova.conductor.manager [ 929.496741] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 929.496741] nova-conductor[52553]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 929.496741] nova-conductor[52553]: ERROR nova.conductor.manager [ 929.496741] nova-conductor[52553]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 929.496741] nova-conductor[52553]: ERROR nova.conductor.manager [ 929.496741] nova-conductor[52553]: ERROR nova.conductor.manager [ 929.504061] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-7eebab9f-d037-4787-b6c0-e7b186fbf5e9 tempest-SecurityGroupsTestJSON-2074217135 tempest-SecurityGroupsTestJSON-2074217135-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 929.504307] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-7eebab9f-d037-4787-b6c0-e7b186fbf5e9 tempest-SecurityGroupsTestJSON-2074217135 tempest-SecurityGroupsTestJSON-2074217135-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 929.504620] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-7eebab9f-d037-4787-b6c0-e7b186fbf5e9 tempest-SecurityGroupsTestJSON-2074217135 tempest-SecurityGroupsTestJSON-2074217135-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 929.546919] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-7eebab9f-d037-4787-b6c0-e7b186fbf5e9 tempest-SecurityGroupsTestJSON-2074217135 tempest-SecurityGroupsTestJSON-2074217135-project-member] [instance: 90d1dfae-4870-47d3-aec5-0a028d20b71f] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 929.547633] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-7eebab9f-d037-4787-b6c0-e7b186fbf5e9 tempest-SecurityGroupsTestJSON-2074217135 tempest-SecurityGroupsTestJSON-2074217135-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 929.547848] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-7eebab9f-d037-4787-b6c0-e7b186fbf5e9 tempest-SecurityGroupsTestJSON-2074217135 tempest-SecurityGroupsTestJSON-2074217135-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 929.548029] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-7eebab9f-d037-4787-b6c0-e7b186fbf5e9 tempest-SecurityGroupsTestJSON-2074217135 tempest-SecurityGroupsTestJSON-2074217135-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 929.551092] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-7eebab9f-d037-4787-b6c0-e7b186fbf5e9 tempest-SecurityGroupsTestJSON-2074217135 tempest-SecurityGroupsTestJSON-2074217135-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 929.551092] nova-conductor[52553]: Traceback (most recent call last): [ 929.551092] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 929.551092] nova-conductor[52553]: return func(*args, **kwargs) [ 929.551092] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 929.551092] nova-conductor[52553]: selections = self._select_destinations( [ 929.551092] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 929.551092] nova-conductor[52553]: selections = self._schedule( [ 929.551092] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 929.551092] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 929.551092] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 929.551092] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 929.551092] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 929.551092] nova-conductor[52553]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 929.551609] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-7eebab9f-d037-4787-b6c0-e7b186fbf5e9 tempest-SecurityGroupsTestJSON-2074217135 tempest-SecurityGroupsTestJSON-2074217135-project-member] [instance: 90d1dfae-4870-47d3-aec5-0a028d20b71f] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 933.113027] nova-conductor[52554]: ERROR nova.conductor.manager [None req-687bf314-44f6-4d8d-a845-521e166ac51a tempest-InstanceActionsTestJSON-2091493268 tempest-InstanceActionsTestJSON-2091493268-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 933.113027] nova-conductor[52554]: Traceback (most recent call last): [ 933.113027] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 933.113027] nova-conductor[52554]: return func(*args, **kwargs) [ 933.113027] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 933.113027] nova-conductor[52554]: selections = self._select_destinations( [ 933.113027] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 933.113027] nova-conductor[52554]: selections = self._schedule( [ 933.113027] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 933.113027] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 933.113027] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 933.113027] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 933.113027] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 933.113027] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 933.113027] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 933.113027] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 933.113027] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 933.113027] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 933.113027] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 933.113027] nova-conductor[52554]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 933.113027] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 933.113027] nova-conductor[52554]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 933.113027] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 933.113027] nova-conductor[52554]: ERROR nova.conductor.manager result = self.transport._send( [ 933.113027] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 933.113027] nova-conductor[52554]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 933.113027] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 933.113027] nova-conductor[52554]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 933.113027] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 933.113027] nova-conductor[52554]: ERROR nova.conductor.manager raise result [ 933.113027] nova-conductor[52554]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 933.113027] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 933.113027] nova-conductor[52554]: ERROR nova.conductor.manager [ 933.113027] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 933.113027] nova-conductor[52554]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 933.113027] nova-conductor[52554]: ERROR nova.conductor.manager [ 933.113027] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 933.113027] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._select_destinations( [ 933.113027] nova-conductor[52554]: ERROR nova.conductor.manager [ 933.113027] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 933.113027] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._schedule( [ 933.113027] nova-conductor[52554]: ERROR nova.conductor.manager [ 933.113027] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 933.113027] nova-conductor[52554]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 933.113027] nova-conductor[52554]: ERROR nova.conductor.manager [ 933.113027] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 933.113027] nova-conductor[52554]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 933.113027] nova-conductor[52554]: ERROR nova.conductor.manager [ 933.113027] nova-conductor[52554]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 933.113027] nova-conductor[52554]: ERROR nova.conductor.manager [ 933.113027] nova-conductor[52554]: ERROR nova.conductor.manager [ 933.121412] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-687bf314-44f6-4d8d-a845-521e166ac51a tempest-InstanceActionsTestJSON-2091493268 tempest-InstanceActionsTestJSON-2091493268-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 933.121640] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-687bf314-44f6-4d8d-a845-521e166ac51a tempest-InstanceActionsTestJSON-2091493268 tempest-InstanceActionsTestJSON-2091493268-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 933.121815] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-687bf314-44f6-4d8d-a845-521e166ac51a tempest-InstanceActionsTestJSON-2091493268 tempest-InstanceActionsTestJSON-2091493268-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 933.162993] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-687bf314-44f6-4d8d-a845-521e166ac51a tempest-InstanceActionsTestJSON-2091493268 tempest-InstanceActionsTestJSON-2091493268-project-member] [instance: 67567d03-d3f9-4d91-bb86-00545d7ccd93] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 933.163669] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-687bf314-44f6-4d8d-a845-521e166ac51a tempest-InstanceActionsTestJSON-2091493268 tempest-InstanceActionsTestJSON-2091493268-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 933.163888] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-687bf314-44f6-4d8d-a845-521e166ac51a tempest-InstanceActionsTestJSON-2091493268 tempest-InstanceActionsTestJSON-2091493268-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 933.164471] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-687bf314-44f6-4d8d-a845-521e166ac51a tempest-InstanceActionsTestJSON-2091493268 tempest-InstanceActionsTestJSON-2091493268-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 933.167504] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-687bf314-44f6-4d8d-a845-521e166ac51a tempest-InstanceActionsTestJSON-2091493268 tempest-InstanceActionsTestJSON-2091493268-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 933.167504] nova-conductor[52554]: Traceback (most recent call last): [ 933.167504] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 933.167504] nova-conductor[52554]: return func(*args, **kwargs) [ 933.167504] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 933.167504] nova-conductor[52554]: selections = self._select_destinations( [ 933.167504] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 933.167504] nova-conductor[52554]: selections = self._schedule( [ 933.167504] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 933.167504] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 933.167504] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 933.167504] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 933.167504] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 933.167504] nova-conductor[52554]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 933.168039] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-687bf314-44f6-4d8d-a845-521e166ac51a tempest-InstanceActionsTestJSON-2091493268 tempest-InstanceActionsTestJSON-2091493268-project-member] [instance: 67567d03-d3f9-4d91-bb86-00545d7ccd93] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 933.621850] nova-conductor[52553]: ERROR nova.conductor.manager [None req-5cb92aa3-dd77-4b2a-9580-13d61acf1dfc tempest-SecurityGroupsTestJSON-2074217135 tempest-SecurityGroupsTestJSON-2074217135-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 933.621850] nova-conductor[52553]: Traceback (most recent call last): [ 933.621850] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 933.621850] nova-conductor[52553]: return func(*args, **kwargs) [ 933.621850] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 933.621850] nova-conductor[52553]: selections = self._select_destinations( [ 933.621850] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 933.621850] nova-conductor[52553]: selections = self._schedule( [ 933.621850] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 933.621850] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 933.621850] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 933.621850] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 933.621850] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 933.621850] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 933.621850] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 933.621850] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 933.621850] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 933.621850] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 933.621850] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 933.621850] nova-conductor[52553]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 933.621850] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 933.621850] nova-conductor[52553]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 933.621850] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 933.621850] nova-conductor[52553]: ERROR nova.conductor.manager result = self.transport._send( [ 933.621850] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 933.621850] nova-conductor[52553]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 933.621850] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 933.621850] nova-conductor[52553]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 933.621850] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 933.621850] nova-conductor[52553]: ERROR nova.conductor.manager raise result [ 933.621850] nova-conductor[52553]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 933.621850] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 933.621850] nova-conductor[52553]: ERROR nova.conductor.manager [ 933.621850] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 933.621850] nova-conductor[52553]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 933.621850] nova-conductor[52553]: ERROR nova.conductor.manager [ 933.621850] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 933.621850] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._select_destinations( [ 933.621850] nova-conductor[52553]: ERROR nova.conductor.manager [ 933.621850] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 933.621850] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._schedule( [ 933.621850] nova-conductor[52553]: ERROR nova.conductor.manager [ 933.621850] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 933.621850] nova-conductor[52553]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 933.621850] nova-conductor[52553]: ERROR nova.conductor.manager [ 933.621850] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 933.621850] nova-conductor[52553]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 933.621850] nova-conductor[52553]: ERROR nova.conductor.manager [ 933.621850] nova-conductor[52553]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 933.621850] nova-conductor[52553]: ERROR nova.conductor.manager [ 933.621850] nova-conductor[52553]: ERROR nova.conductor.manager [ 933.628865] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-5cb92aa3-dd77-4b2a-9580-13d61acf1dfc tempest-SecurityGroupsTestJSON-2074217135 tempest-SecurityGroupsTestJSON-2074217135-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 933.629114] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-5cb92aa3-dd77-4b2a-9580-13d61acf1dfc tempest-SecurityGroupsTestJSON-2074217135 tempest-SecurityGroupsTestJSON-2074217135-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 933.629292] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-5cb92aa3-dd77-4b2a-9580-13d61acf1dfc tempest-SecurityGroupsTestJSON-2074217135 tempest-SecurityGroupsTestJSON-2074217135-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 933.668116] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-5cb92aa3-dd77-4b2a-9580-13d61acf1dfc tempest-SecurityGroupsTestJSON-2074217135 tempest-SecurityGroupsTestJSON-2074217135-project-member] [instance: b86bcb13-a4cb-4d3b-a705-bfec85bab98a] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 933.668255] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-5cb92aa3-dd77-4b2a-9580-13d61acf1dfc tempest-SecurityGroupsTestJSON-2074217135 tempest-SecurityGroupsTestJSON-2074217135-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 933.668362] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-5cb92aa3-dd77-4b2a-9580-13d61acf1dfc tempest-SecurityGroupsTestJSON-2074217135 tempest-SecurityGroupsTestJSON-2074217135-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 933.668536] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-5cb92aa3-dd77-4b2a-9580-13d61acf1dfc tempest-SecurityGroupsTestJSON-2074217135 tempest-SecurityGroupsTestJSON-2074217135-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 933.672038] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-5cb92aa3-dd77-4b2a-9580-13d61acf1dfc tempest-SecurityGroupsTestJSON-2074217135 tempest-SecurityGroupsTestJSON-2074217135-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 933.672038] nova-conductor[52553]: Traceback (most recent call last): [ 933.672038] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 933.672038] nova-conductor[52553]: return func(*args, **kwargs) [ 933.672038] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 933.672038] nova-conductor[52553]: selections = self._select_destinations( [ 933.672038] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 933.672038] nova-conductor[52553]: selections = self._schedule( [ 933.672038] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 933.672038] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 933.672038] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 933.672038] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 933.672038] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 933.672038] nova-conductor[52553]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 933.672038] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-5cb92aa3-dd77-4b2a-9580-13d61acf1dfc tempest-SecurityGroupsTestJSON-2074217135 tempest-SecurityGroupsTestJSON-2074217135-project-member] [instance: b86bcb13-a4cb-4d3b-a705-bfec85bab98a] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 940.081366] nova-conductor[52554]: ERROR nova.conductor.manager [None req-10469c58-cbdc-4189-b61d-aba3c030b185 tempest-ServerRescueTestJSON-337503008 tempest-ServerRescueTestJSON-337503008-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 940.081366] nova-conductor[52554]: Traceback (most recent call last): [ 940.081366] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 940.081366] nova-conductor[52554]: return func(*args, **kwargs) [ 940.081366] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 940.081366] nova-conductor[52554]: selections = self._select_destinations( [ 940.081366] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 940.081366] nova-conductor[52554]: selections = self._schedule( [ 940.081366] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 940.081366] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 940.081366] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 940.081366] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 940.081366] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 940.081366] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 940.081366] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 940.081366] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 940.081366] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 940.081366] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 940.081366] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 940.081366] nova-conductor[52554]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 940.081366] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 940.081366] nova-conductor[52554]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 940.081366] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 940.081366] nova-conductor[52554]: ERROR nova.conductor.manager result = self.transport._send( [ 940.081366] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 940.081366] nova-conductor[52554]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 940.081366] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 940.081366] nova-conductor[52554]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 940.081366] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 940.081366] nova-conductor[52554]: ERROR nova.conductor.manager raise result [ 940.081366] nova-conductor[52554]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 940.081366] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 940.081366] nova-conductor[52554]: ERROR nova.conductor.manager [ 940.081366] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 940.081366] nova-conductor[52554]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 940.081366] nova-conductor[52554]: ERROR nova.conductor.manager [ 940.081366] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 940.081366] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._select_destinations( [ 940.081366] nova-conductor[52554]: ERROR nova.conductor.manager [ 940.081366] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 940.081366] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._schedule( [ 940.081366] nova-conductor[52554]: ERROR nova.conductor.manager [ 940.081366] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 940.081366] nova-conductor[52554]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 940.081366] nova-conductor[52554]: ERROR nova.conductor.manager [ 940.081366] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 940.081366] nova-conductor[52554]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 940.081366] nova-conductor[52554]: ERROR nova.conductor.manager [ 940.081366] nova-conductor[52554]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 940.081366] nova-conductor[52554]: ERROR nova.conductor.manager [ 940.081366] nova-conductor[52554]: ERROR nova.conductor.manager [ 940.088518] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-10469c58-cbdc-4189-b61d-aba3c030b185 tempest-ServerRescueTestJSON-337503008 tempest-ServerRescueTestJSON-337503008-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 940.088748] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-10469c58-cbdc-4189-b61d-aba3c030b185 tempest-ServerRescueTestJSON-337503008 tempest-ServerRescueTestJSON-337503008-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 940.090047] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-10469c58-cbdc-4189-b61d-aba3c030b185 tempest-ServerRescueTestJSON-337503008 tempest-ServerRescueTestJSON-337503008-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 940.128472] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-10469c58-cbdc-4189-b61d-aba3c030b185 tempest-ServerRescueTestJSON-337503008 tempest-ServerRescueTestJSON-337503008-project-member] [instance: 8c8cf939-4ff4-42ed-a9fd-1908e95ba35c] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 940.129238] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-10469c58-cbdc-4189-b61d-aba3c030b185 tempest-ServerRescueTestJSON-337503008 tempest-ServerRescueTestJSON-337503008-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 940.129454] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-10469c58-cbdc-4189-b61d-aba3c030b185 tempest-ServerRescueTestJSON-337503008 tempest-ServerRescueTestJSON-337503008-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 940.129628] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-10469c58-cbdc-4189-b61d-aba3c030b185 tempest-ServerRescueTestJSON-337503008 tempest-ServerRescueTestJSON-337503008-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 940.132325] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-10469c58-cbdc-4189-b61d-aba3c030b185 tempest-ServerRescueTestJSON-337503008 tempest-ServerRescueTestJSON-337503008-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 940.132325] nova-conductor[52554]: Traceback (most recent call last): [ 940.132325] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 940.132325] nova-conductor[52554]: return func(*args, **kwargs) [ 940.132325] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 940.132325] nova-conductor[52554]: selections = self._select_destinations( [ 940.132325] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 940.132325] nova-conductor[52554]: selections = self._schedule( [ 940.132325] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 940.132325] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 940.132325] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 940.132325] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 940.132325] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 940.132325] nova-conductor[52554]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 940.132867] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-10469c58-cbdc-4189-b61d-aba3c030b185 tempest-ServerRescueTestJSON-337503008 tempest-ServerRescueTestJSON-337503008-project-member] [instance: 8c8cf939-4ff4-42ed-a9fd-1908e95ba35c] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 946.414973] nova-conductor[52553]: ERROR nova.conductor.manager [None req-3500ee8a-a6e1-4269-9d66-1b7147f95646 tempest-AttachVolumeNegativeTest-1010005069 tempest-AttachVolumeNegativeTest-1010005069-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 946.414973] nova-conductor[52553]: Traceback (most recent call last): [ 946.414973] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 946.414973] nova-conductor[52553]: return func(*args, **kwargs) [ 946.414973] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 946.414973] nova-conductor[52553]: selections = self._select_destinations( [ 946.414973] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 946.414973] nova-conductor[52553]: selections = self._schedule( [ 946.414973] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 946.414973] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 946.414973] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 946.414973] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 946.414973] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 946.414973] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 946.414973] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 946.414973] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 946.414973] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 946.414973] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 946.414973] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 946.414973] nova-conductor[52553]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 946.414973] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 946.414973] nova-conductor[52553]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 946.414973] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 946.414973] nova-conductor[52553]: ERROR nova.conductor.manager result = self.transport._send( [ 946.414973] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 946.414973] nova-conductor[52553]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 946.414973] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 946.414973] nova-conductor[52553]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 946.414973] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 946.414973] nova-conductor[52553]: ERROR nova.conductor.manager raise result [ 946.414973] nova-conductor[52553]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 946.414973] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 946.414973] nova-conductor[52553]: ERROR nova.conductor.manager [ 946.414973] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 946.414973] nova-conductor[52553]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 946.414973] nova-conductor[52553]: ERROR nova.conductor.manager [ 946.414973] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 946.414973] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._select_destinations( [ 946.414973] nova-conductor[52553]: ERROR nova.conductor.manager [ 946.414973] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 946.414973] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._schedule( [ 946.414973] nova-conductor[52553]: ERROR nova.conductor.manager [ 946.414973] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 946.414973] nova-conductor[52553]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 946.414973] nova-conductor[52553]: ERROR nova.conductor.manager [ 946.414973] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 946.414973] nova-conductor[52553]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 946.414973] nova-conductor[52553]: ERROR nova.conductor.manager [ 946.414973] nova-conductor[52553]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 946.414973] nova-conductor[52553]: ERROR nova.conductor.manager [ 946.414973] nova-conductor[52553]: ERROR nova.conductor.manager [ 946.421581] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-3500ee8a-a6e1-4269-9d66-1b7147f95646 tempest-AttachVolumeNegativeTest-1010005069 tempest-AttachVolumeNegativeTest-1010005069-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 946.421817] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-3500ee8a-a6e1-4269-9d66-1b7147f95646 tempest-AttachVolumeNegativeTest-1010005069 tempest-AttachVolumeNegativeTest-1010005069-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 946.422074] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-3500ee8a-a6e1-4269-9d66-1b7147f95646 tempest-AttachVolumeNegativeTest-1010005069 tempest-AttachVolumeNegativeTest-1010005069-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 946.461779] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-3500ee8a-a6e1-4269-9d66-1b7147f95646 tempest-AttachVolumeNegativeTest-1010005069 tempest-AttachVolumeNegativeTest-1010005069-project-member] [instance: 1068a8ce-c582-4613-954f-be4b11bd3453] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 946.462475] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-3500ee8a-a6e1-4269-9d66-1b7147f95646 tempest-AttachVolumeNegativeTest-1010005069 tempest-AttachVolumeNegativeTest-1010005069-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 946.462736] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-3500ee8a-a6e1-4269-9d66-1b7147f95646 tempest-AttachVolumeNegativeTest-1010005069 tempest-AttachVolumeNegativeTest-1010005069-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 946.463032] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-3500ee8a-a6e1-4269-9d66-1b7147f95646 tempest-AttachVolumeNegativeTest-1010005069 tempest-AttachVolumeNegativeTest-1010005069-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 946.466218] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-3500ee8a-a6e1-4269-9d66-1b7147f95646 tempest-AttachVolumeNegativeTest-1010005069 tempest-AttachVolumeNegativeTest-1010005069-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 946.466218] nova-conductor[52553]: Traceback (most recent call last): [ 946.466218] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 946.466218] nova-conductor[52553]: return func(*args, **kwargs) [ 946.466218] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 946.466218] nova-conductor[52553]: selections = self._select_destinations( [ 946.466218] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 946.466218] nova-conductor[52553]: selections = self._schedule( [ 946.466218] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 946.466218] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 946.466218] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 946.466218] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 946.466218] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 946.466218] nova-conductor[52553]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 946.466649] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-3500ee8a-a6e1-4269-9d66-1b7147f95646 tempest-AttachVolumeNegativeTest-1010005069 tempest-AttachVolumeNegativeTest-1010005069-project-member] [instance: 1068a8ce-c582-4613-954f-be4b11bd3453] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 948.872824] nova-conductor[52554]: ERROR nova.conductor.manager [None req-c21170e9-3c79-4c2c-9147-5e2d582d856f tempest-AttachVolumeNegativeTest-1010005069 tempest-AttachVolumeNegativeTest-1010005069-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 948.872824] nova-conductor[52554]: Traceback (most recent call last): [ 948.872824] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 948.872824] nova-conductor[52554]: return func(*args, **kwargs) [ 948.872824] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 948.872824] nova-conductor[52554]: selections = self._select_destinations( [ 948.872824] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 948.872824] nova-conductor[52554]: selections = self._schedule( [ 948.872824] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 948.872824] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 948.872824] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 948.872824] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 948.872824] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 948.872824] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 948.872824] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 948.872824] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 948.872824] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 948.872824] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 948.872824] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 948.872824] nova-conductor[52554]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 948.872824] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 948.872824] nova-conductor[52554]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 948.872824] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 948.872824] nova-conductor[52554]: ERROR nova.conductor.manager result = self.transport._send( [ 948.872824] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 948.872824] nova-conductor[52554]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 948.872824] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 948.872824] nova-conductor[52554]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 948.872824] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 948.872824] nova-conductor[52554]: ERROR nova.conductor.manager raise result [ 948.872824] nova-conductor[52554]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 948.872824] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 948.872824] nova-conductor[52554]: ERROR nova.conductor.manager [ 948.872824] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 948.872824] nova-conductor[52554]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 948.872824] nova-conductor[52554]: ERROR nova.conductor.manager [ 948.872824] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 948.872824] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._select_destinations( [ 948.872824] nova-conductor[52554]: ERROR nova.conductor.manager [ 948.872824] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 948.872824] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._schedule( [ 948.872824] nova-conductor[52554]: ERROR nova.conductor.manager [ 948.872824] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 948.872824] nova-conductor[52554]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 948.872824] nova-conductor[52554]: ERROR nova.conductor.manager [ 948.872824] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 948.872824] nova-conductor[52554]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 948.872824] nova-conductor[52554]: ERROR nova.conductor.manager [ 948.872824] nova-conductor[52554]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 948.872824] nova-conductor[52554]: ERROR nova.conductor.manager [ 948.872824] nova-conductor[52554]: ERROR nova.conductor.manager [ 948.879419] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-c21170e9-3c79-4c2c-9147-5e2d582d856f tempest-AttachVolumeNegativeTest-1010005069 tempest-AttachVolumeNegativeTest-1010005069-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 948.879641] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-c21170e9-3c79-4c2c-9147-5e2d582d856f tempest-AttachVolumeNegativeTest-1010005069 tempest-AttachVolumeNegativeTest-1010005069-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 948.879812] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-c21170e9-3c79-4c2c-9147-5e2d582d856f tempest-AttachVolumeNegativeTest-1010005069 tempest-AttachVolumeNegativeTest-1010005069-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 948.914788] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-c21170e9-3c79-4c2c-9147-5e2d582d856f tempest-AttachVolumeNegativeTest-1010005069 tempest-AttachVolumeNegativeTest-1010005069-project-member] [instance: d1f897a5-db67-47c0-94ea-7f53b39660ce] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 948.915446] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-c21170e9-3c79-4c2c-9147-5e2d582d856f tempest-AttachVolumeNegativeTest-1010005069 tempest-AttachVolumeNegativeTest-1010005069-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 948.915654] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-c21170e9-3c79-4c2c-9147-5e2d582d856f tempest-AttachVolumeNegativeTest-1010005069 tempest-AttachVolumeNegativeTest-1010005069-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 948.915821] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-c21170e9-3c79-4c2c-9147-5e2d582d856f tempest-AttachVolumeNegativeTest-1010005069 tempest-AttachVolumeNegativeTest-1010005069-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 948.918898] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-c21170e9-3c79-4c2c-9147-5e2d582d856f tempest-AttachVolumeNegativeTest-1010005069 tempest-AttachVolumeNegativeTest-1010005069-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 948.918898] nova-conductor[52554]: Traceback (most recent call last): [ 948.918898] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 948.918898] nova-conductor[52554]: return func(*args, **kwargs) [ 948.918898] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 948.918898] nova-conductor[52554]: selections = self._select_destinations( [ 948.918898] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 948.918898] nova-conductor[52554]: selections = self._schedule( [ 948.918898] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 948.918898] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 948.918898] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 948.918898] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 948.918898] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 948.918898] nova-conductor[52554]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 948.919437] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-c21170e9-3c79-4c2c-9147-5e2d582d856f tempest-AttachVolumeNegativeTest-1010005069 tempest-AttachVolumeNegativeTest-1010005069-project-member] [instance: d1f897a5-db67-47c0-94ea-7f53b39660ce] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 951.375178] nova-conductor[52554]: ERROR nova.conductor.manager [None req-9e4513a8-254a-4cee-b216-40f3dd541627 tempest-AttachVolumeNegativeTest-1010005069 tempest-AttachVolumeNegativeTest-1010005069-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 951.375178] nova-conductor[52554]: Traceback (most recent call last): [ 951.375178] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 951.375178] nova-conductor[52554]: return func(*args, **kwargs) [ 951.375178] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 951.375178] nova-conductor[52554]: selections = self._select_destinations( [ 951.375178] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 951.375178] nova-conductor[52554]: selections = self._schedule( [ 951.375178] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 951.375178] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 951.375178] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 951.375178] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 951.375178] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 951.375178] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 951.375178] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 951.375178] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 951.375178] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 951.375178] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 951.375178] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 951.375178] nova-conductor[52554]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 951.375178] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 951.375178] nova-conductor[52554]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 951.375178] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 951.375178] nova-conductor[52554]: ERROR nova.conductor.manager result = self.transport._send( [ 951.375178] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 951.375178] nova-conductor[52554]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 951.375178] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 951.375178] nova-conductor[52554]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 951.375178] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 951.375178] nova-conductor[52554]: ERROR nova.conductor.manager raise result [ 951.375178] nova-conductor[52554]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 951.375178] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 951.375178] nova-conductor[52554]: ERROR nova.conductor.manager [ 951.375178] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 951.375178] nova-conductor[52554]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 951.375178] nova-conductor[52554]: ERROR nova.conductor.manager [ 951.375178] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 951.375178] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._select_destinations( [ 951.375178] nova-conductor[52554]: ERROR nova.conductor.manager [ 951.375178] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 951.375178] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._schedule( [ 951.375178] nova-conductor[52554]: ERROR nova.conductor.manager [ 951.375178] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 951.375178] nova-conductor[52554]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 951.375178] nova-conductor[52554]: ERROR nova.conductor.manager [ 951.375178] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 951.375178] nova-conductor[52554]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 951.375178] nova-conductor[52554]: ERROR nova.conductor.manager [ 951.375178] nova-conductor[52554]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 951.375178] nova-conductor[52554]: ERROR nova.conductor.manager [ 951.375178] nova-conductor[52554]: ERROR nova.conductor.manager [ 951.381714] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-9e4513a8-254a-4cee-b216-40f3dd541627 tempest-AttachVolumeNegativeTest-1010005069 tempest-AttachVolumeNegativeTest-1010005069-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 951.381936] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-9e4513a8-254a-4cee-b216-40f3dd541627 tempest-AttachVolumeNegativeTest-1010005069 tempest-AttachVolumeNegativeTest-1010005069-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 951.382126] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-9e4513a8-254a-4cee-b216-40f3dd541627 tempest-AttachVolumeNegativeTest-1010005069 tempest-AttachVolumeNegativeTest-1010005069-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 951.420240] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-9e4513a8-254a-4cee-b216-40f3dd541627 tempest-AttachVolumeNegativeTest-1010005069 tempest-AttachVolumeNegativeTest-1010005069-project-member] [instance: fc1a1038-af9b-4f60-ac57-4d7eba6bdef8] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 951.420869] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-9e4513a8-254a-4cee-b216-40f3dd541627 tempest-AttachVolumeNegativeTest-1010005069 tempest-AttachVolumeNegativeTest-1010005069-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 951.421092] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-9e4513a8-254a-4cee-b216-40f3dd541627 tempest-AttachVolumeNegativeTest-1010005069 tempest-AttachVolumeNegativeTest-1010005069-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 951.421270] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-9e4513a8-254a-4cee-b216-40f3dd541627 tempest-AttachVolumeNegativeTest-1010005069 tempest-AttachVolumeNegativeTest-1010005069-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 951.424070] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-9e4513a8-254a-4cee-b216-40f3dd541627 tempest-AttachVolumeNegativeTest-1010005069 tempest-AttachVolumeNegativeTest-1010005069-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 951.424070] nova-conductor[52554]: Traceback (most recent call last): [ 951.424070] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 951.424070] nova-conductor[52554]: return func(*args, **kwargs) [ 951.424070] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 951.424070] nova-conductor[52554]: selections = self._select_destinations( [ 951.424070] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 951.424070] nova-conductor[52554]: selections = self._schedule( [ 951.424070] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 951.424070] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 951.424070] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 951.424070] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 951.424070] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 951.424070] nova-conductor[52554]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 951.425429] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-9e4513a8-254a-4cee-b216-40f3dd541627 tempest-AttachVolumeNegativeTest-1010005069 tempest-AttachVolumeNegativeTest-1010005069-project-member] [instance: fc1a1038-af9b-4f60-ac57-4d7eba6bdef8] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 960.151954] nova-conductor[52553]: ERROR nova.scheduler.utils [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance 426f9016-4e69-4e46-87f6-a67f77da5dff was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 960.152856] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Rescheduling: True {{(pid=52553) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 960.152856] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 426f9016-4e69-4e46-87f6-a67f77da5dff.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. 
Exhausted all hosts available for retrying build failures for instance 426f9016-4e69-4e46-87f6-a67f77da5dff. [ 960.153010] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 426f9016-4e69-4e46-87f6-a67f77da5dff. [ 960.177517] nova-conductor[52553]: DEBUG nova.network.neutron [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] deallocate_for_instance() {{(pid=52553) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 960.308881] nova-conductor[52553]: DEBUG nova.network.neutron [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Instance cache missing network info. {{(pid=52553) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 960.313408] nova-conductor[52553]: DEBUG nova.network.neutron [None req-e4af0805-3bc9-43c0-99bd-4cdee44030ac tempest-ServerExternalEventsTest-1949741007 tempest-ServerExternalEventsTest-1949741007-project-member] [instance: 426f9016-4e69-4e46-87f6-a67f77da5dff] Updating instance_info_cache with network_info: [] {{(pid=52553) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1007.910594] nova-conductor[52554]: ERROR nova.scheduler.utils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, 
**self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance 27836d31-f379-4b4b-aed1-155f4a947779 was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 1007.911204] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Rescheduling: True {{(pid=52554) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 1007.911437] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 27836d31-f379-4b4b-aed1-155f4a947779.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 27836d31-f379-4b4b-aed1-155f4a947779. [ 1007.911648] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 27836d31-f379-4b4b-aed1-155f4a947779. [ 1007.941988] nova-conductor[52554]: DEBUG nova.network.neutron [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] deallocate_for_instance() {{(pid=52554) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1007.998359] nova-conductor[52554]: DEBUG nova.network.neutron [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Instance cache missing network info. 
{{(pid=52554) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1008.002256] nova-conductor[52554]: DEBUG nova.network.neutron [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 27836d31-f379-4b4b-aed1-155f4a947779] Updating instance_info_cache with network_info: [] {{(pid=52554) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1055.730133] nova-conductor[52553]: ERROR nova.scheduler.utils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance ef011071-c0e1-44e0-9940-285f2f45da67 was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 1055.732946] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Rescheduling: True {{(pid=52553) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 1055.732946] nova-conductor[52553]: WARNING nova.scheduler.utils [None 
req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance ef011071-c0e1-44e0-9940-285f2f45da67.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance ef011071-c0e1-44e0-9940-285f2f45da67. [ 1055.732946] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance ef011071-c0e1-44e0-9940-285f2f45da67. [ 1055.752816] nova-conductor[52553]: DEBUG nova.network.neutron [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] deallocate_for_instance() {{(pid=52553) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1055.783710] nova-conductor[52553]: DEBUG nova.network.neutron [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Instance cache missing network info. {{(pid=52553) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1055.787040] nova-conductor[52553]: DEBUG nova.network.neutron [None req-ed4c064e-096a-46b0-a8e3-7c60afd9ead5 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: ef011071-c0e1-44e0-9940-285f2f45da67] Updating instance_info_cache with network_info: [] {{(pid=52553) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1055.983391] nova-conductor[52554]: DEBUG nova.db.main.api [None req-7a57a820-6d71-4837-8066-3a74dbea2f2c tempest-DeleteServersTestJSON-478355548 tempest-DeleteServersTestJSON-478355548-project-member] Created instance_extra for 6874067b-8e9b-4242-9a5f-6312f1484a00 {{(pid=52554) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 1056.712710] nova-conductor[52554]: DEBUG nova.db.main.api [None req-0ff6ff40-b46c-46b7-900d-7f606e617772 tempest-ServerGroupTestJSON-1664601793 tempest-ServerGroupTestJSON-1664601793-project-member] Created instance_extra for f03f507b-364f-41b9-ad33-dcb56ab03317 {{(pid=52554) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 1057.153062] nova-conductor[52554]: ERROR nova.conductor.manager [None req-fc39e67f-3859-4ea7-9706-cc23c2272b73 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 1057.153062] nova-conductor[52554]: Traceback (most recent call last): [ 1057.153062] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1057.153062] nova-conductor[52554]: return func(*args, **kwargs) [ 1057.153062] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1057.153062] nova-conductor[52554]: selections = self._select_destinations( [ 1057.153062] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1057.153062] nova-conductor[52554]: selections = self._schedule( [ 1057.153062] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1057.153062] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 1057.153062] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1057.153062] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 1057.153062] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1057.153062] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1057.153062] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 1057.153062] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 1057.153062] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 1057.153062] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 1057.153062] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 1057.153062] nova-conductor[52554]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 1057.153062] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 1057.153062] nova-conductor[52554]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 1057.153062] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1057.153062] nova-conductor[52554]: ERROR nova.conductor.manager result = self.transport._send( [ 1057.153062] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1057.153062] nova-conductor[52554]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 1057.153062] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1057.153062] nova-conductor[52554]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1057.153062] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1057.153062] nova-conductor[52554]: ERROR nova.conductor.manager raise result [ 1057.153062] nova-conductor[52554]: ERROR 
nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1057.153062] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1057.153062] nova-conductor[52554]: ERROR nova.conductor.manager [ 1057.153062] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1057.153062] nova-conductor[52554]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 1057.153062] nova-conductor[52554]: ERROR nova.conductor.manager [ 1057.153062] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1057.153062] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._select_destinations( [ 1057.153062] nova-conductor[52554]: ERROR nova.conductor.manager [ 1057.153062] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1057.153062] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._schedule( [ 1057.153062] nova-conductor[52554]: ERROR nova.conductor.manager [ 1057.153062] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1057.153062] nova-conductor[52554]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 1057.153062] nova-conductor[52554]: ERROR nova.conductor.manager [ 1057.153062] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1057.153062] nova-conductor[52554]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 1057.153062] nova-conductor[52554]: ERROR nova.conductor.manager [ 1057.153062] nova-conductor[52554]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 1057.153062] nova-conductor[52554]: ERROR nova.conductor.manager [ 1057.153062] nova-conductor[52554]: ERROR nova.conductor.manager [ 1057.160192] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-fc39e67f-3859-4ea7-9706-cc23c2272b73 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1057.160410] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-fc39e67f-3859-4ea7-9706-cc23c2272b73 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1057.160580] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-fc39e67f-3859-4ea7-9706-cc23c2272b73 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1057.201271] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-fc39e67f-3859-4ea7-9706-cc23c2272b73 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 2cd4487c-30bc-4b13-ab5e-0a000b537a84] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 1057.201942] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-fc39e67f-3859-4ea7-9706-cc23c2272b73 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1057.202212] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-fc39e67f-3859-4ea7-9706-cc23c2272b73 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1057.202387] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-fc39e67f-3859-4ea7-9706-cc23c2272b73 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1057.205021] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-fc39e67f-3859-4ea7-9706-cc23c2272b73 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 1057.205021] nova-conductor[52554]: Traceback (most recent call last): [ 1057.205021] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1057.205021] nova-conductor[52554]: return func(*args, **kwargs) [ 1057.205021] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1057.205021] nova-conductor[52554]: selections = self._select_destinations( [ 1057.205021] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1057.205021] nova-conductor[52554]: selections = self._schedule( [ 1057.205021] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1057.205021] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 1057.205021] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1057.205021] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 1057.205021] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1057.205021] nova-conductor[52554]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1057.205574] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-fc39e67f-3859-4ea7-9706-cc23c2272b73 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 2cd4487c-30bc-4b13-ab5e-0a000b537a84] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 1057.226436] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-fc39e67f-3859-4ea7-9706-cc23c2272b73 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1057.226652] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-fc39e67f-3859-4ea7-9706-cc23c2272b73 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1057.226837] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-fc39e67f-3859-4ea7-9706-cc23c2272b73 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1057.262191] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-fc39e67f-3859-4ea7-9706-cc23c2272b73 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 5049c203-f7eb-4601-8a7a-6accb278a9c9] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 1057.262821] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-fc39e67f-3859-4ea7-9706-cc23c2272b73 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1057.263070] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-fc39e67f-3859-4ea7-9706-cc23c2272b73 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1057.263252] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-fc39e67f-3859-4ea7-9706-cc23c2272b73 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1057.265867] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-fc39e67f-3859-4ea7-9706-cc23c2272b73 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 1057.265867] nova-conductor[52554]: Traceback (most recent call last): [ 1057.265867] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1057.265867] nova-conductor[52554]: return func(*args, **kwargs) [ 1057.265867] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1057.265867] nova-conductor[52554]: selections = self._select_destinations( [ 1057.265867] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1057.265867] nova-conductor[52554]: selections = self._schedule( [ 1057.265867] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1057.265867] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 1057.265867] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1057.265867] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 1057.265867] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1057.265867] nova-conductor[52554]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1057.266407] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-fc39e67f-3859-4ea7-9706-cc23c2272b73 tempest-MultipleCreateTestJSON-359342352 tempest-MultipleCreateTestJSON-359342352-project-member] [instance: 5049c203-f7eb-4601-8a7a-6accb278a9c9] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1057.441375] nova-conductor[52553]: DEBUG nova.db.main.api [None req-371dde63-be9b-4791-aed0-9952f41c246d tempest-ServersTestJSON-1853208137 tempest-ServersTestJSON-1853208137-project-member] Created instance_extra for ea4a243b-481f-421d-ba29-c88c828f754e {{(pid=52553) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 1058.167806] nova-conductor[52553]: DEBUG nova.db.main.api [None req-6e22221b-a0f2-4bed-96ff-40f7ade0c12d tempest-AttachInterfacesV270Test-209270185 tempest-AttachInterfacesV270Test-209270185-project-member] Created instance_extra for df997589-61b6-4f68-9169-e6f9bee650c7 {{(pid=52553) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 1058.886781] nova-conductor[52553]: DEBUG nova.db.main.api [None req-a86eed95-5979-48c1-a329-04be2070467d tempest-ServersTestJSON-1097912379 tempest-ServersTestJSON-1097912379-project-member] Created instance_extra for 0e99d8ab-6b62-4ea9-b7c9-06394fa93e09 {{(pid=52553) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 1061.201996] nova-conductor[52553]: ERROR nova.conductor.manager [None req-a4c7b854-69d7-4ec9-b135-287fbeac9944 tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 1061.201996] nova-conductor[52553]: Traceback (most recent call last): [ 1061.201996] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1061.201996] nova-conductor[52553]: return func(*args, **kwargs) [ 1061.201996] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1061.201996] nova-conductor[52553]: selections = self._select_destinations( [ 1061.201996] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1061.201996] nova-conductor[52553]: selections = self._schedule( [ 1061.201996] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1061.201996] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 1061.201996] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1061.201996] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 1061.201996] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1061.201996] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1061.201996] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 1061.201996] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 1061.201996] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 1061.201996] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 1061.201996] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 1061.201996] nova-conductor[52553]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 1061.201996] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 1061.201996] nova-conductor[52553]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 1061.201996] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1061.201996] nova-conductor[52553]: ERROR nova.conductor.manager result = self.transport._send( [ 1061.201996] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1061.201996] nova-conductor[52553]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 1061.201996] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1061.201996] nova-conductor[52553]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1061.201996] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1061.201996] nova-conductor[52553]: ERROR nova.conductor.manager raise result [ 1061.201996] nova-conductor[52553]: ERROR 
nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1061.201996] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1061.201996] nova-conductor[52553]: ERROR nova.conductor.manager [ 1061.201996] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1061.201996] nova-conductor[52553]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 1061.201996] nova-conductor[52553]: ERROR nova.conductor.manager [ 1061.201996] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1061.201996] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._select_destinations( [ 1061.201996] nova-conductor[52553]: ERROR nova.conductor.manager [ 1061.201996] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1061.201996] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._schedule( [ 1061.201996] nova-conductor[52553]: ERROR nova.conductor.manager [ 1061.201996] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1061.201996] nova-conductor[52553]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 1061.201996] nova-conductor[52553]: ERROR nova.conductor.manager [ 1061.201996] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1061.201996] nova-conductor[52553]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 1061.201996] nova-conductor[52553]: ERROR nova.conductor.manager [ 1061.201996] nova-conductor[52553]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 1061.201996] nova-conductor[52553]: ERROR nova.conductor.manager [ 1061.201996] nova-conductor[52553]: ERROR nova.conductor.manager [ 1061.208926] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-a4c7b854-69d7-4ec9-b135-287fbeac9944 tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1061.209187] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-a4c7b854-69d7-4ec9-b135-287fbeac9944 tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1061.209363] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-a4c7b854-69d7-4ec9-b135-287fbeac9944 tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1061.252744] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-a4c7b854-69d7-4ec9-b135-287fbeac9944 tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: 9cc5100d-3f93-4aac-8eb8-71776f8d765d] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 1061.253481] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-a4c7b854-69d7-4ec9-b135-287fbeac9944 tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1061.253697] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-a4c7b854-69d7-4ec9-b135-287fbeac9944 tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1061.253870] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-a4c7b854-69d7-4ec9-b135-287fbeac9944 tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] 
Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1061.257153] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-a4c7b854-69d7-4ec9-b135-287fbeac9944 tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 1061.257153] nova-conductor[52553]: Traceback (most recent call last): [ 1061.257153] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1061.257153] nova-conductor[52553]: return func(*args, **kwargs) [ 1061.257153] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1061.257153] nova-conductor[52553]: selections = self._select_destinations( [ 1061.257153] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1061.257153] nova-conductor[52553]: selections = self._schedule( [ 1061.257153] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1061.257153] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 1061.257153] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1061.257153] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 1061.257153] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1061.257153] nova-conductor[52553]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1061.257670] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-a4c7b854-69d7-4ec9-b135-287fbeac9944 tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] [instance: 9cc5100d-3f93-4aac-8eb8-71776f8d765d] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1064.464589] nova-conductor[52554]: ERROR nova.conductor.manager [None req-285a127f-efeb-407b-88b1-8d8daa088884 tempest-AttachVolumeShelveTestJSON-60207750 tempest-AttachVolumeShelveTestJSON-60207750-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 1064.464589] nova-conductor[52554]: Traceback (most recent call last): [ 1064.464589] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1064.464589] nova-conductor[52554]: return func(*args, **kwargs) [ 1064.464589] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1064.464589] nova-conductor[52554]: selections = self._select_destinations( [ 1064.464589] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1064.464589] nova-conductor[52554]: selections = self._schedule( [ 1064.464589] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1064.464589] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 1064.464589] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1064.464589] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 1064.464589] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1064.464589] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1064.464589] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 1064.464589] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 1064.464589] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 1064.464589] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 1064.464589] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 1064.464589] nova-conductor[52554]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 1064.464589] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 1064.464589] nova-conductor[52554]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 1064.464589] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1064.464589] nova-conductor[52554]: ERROR nova.conductor.manager result = self.transport._send( [ 1064.464589] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1064.464589] nova-conductor[52554]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 1064.464589] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1064.464589] nova-conductor[52554]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1064.464589] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1064.464589] nova-conductor[52554]: ERROR nova.conductor.manager raise result [ 1064.464589] nova-conductor[52554]: ERROR 
nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1064.464589] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1064.464589] nova-conductor[52554]: ERROR nova.conductor.manager [ 1064.464589] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1064.464589] nova-conductor[52554]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 1064.464589] nova-conductor[52554]: ERROR nova.conductor.manager [ 1064.464589] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1064.464589] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._select_destinations( [ 1064.464589] nova-conductor[52554]: ERROR nova.conductor.manager [ 1064.464589] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1064.464589] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._schedule( [ 1064.464589] nova-conductor[52554]: ERROR nova.conductor.manager [ 1064.464589] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1064.464589] nova-conductor[52554]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 1064.464589] nova-conductor[52554]: ERROR nova.conductor.manager [ 1064.464589] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1064.464589] nova-conductor[52554]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 1064.464589] nova-conductor[52554]: ERROR nova.conductor.manager [ 1064.464589] nova-conductor[52554]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 1064.464589] nova-conductor[52554]: ERROR nova.conductor.manager [ 1064.464589] nova-conductor[52554]: ERROR nova.conductor.manager [ 1064.471611] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-285a127f-efeb-407b-88b1-8d8daa088884 tempest-AttachVolumeShelveTestJSON-60207750 tempest-AttachVolumeShelveTestJSON-60207750-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1064.471914] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-285a127f-efeb-407b-88b1-8d8daa088884 tempest-AttachVolumeShelveTestJSON-60207750 tempest-AttachVolumeShelveTestJSON-60207750-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1064.472113] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-285a127f-efeb-407b-88b1-8d8daa088884 tempest-AttachVolumeShelveTestJSON-60207750 tempest-AttachVolumeShelveTestJSON-60207750-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1064.516147] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-285a127f-efeb-407b-88b1-8d8daa088884 tempest-AttachVolumeShelveTestJSON-60207750 tempest-AttachVolumeShelveTestJSON-60207750-project-member] [instance: 3ddae09d-0924-4332-a7ea-c81a56bc6d95] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 1064.517680] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-285a127f-efeb-407b-88b1-8d8daa088884 tempest-AttachVolumeShelveTestJSON-60207750 tempest-AttachVolumeShelveTestJSON-60207750-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1064.517680] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-285a127f-efeb-407b-88b1-8d8daa088884 tempest-AttachVolumeShelveTestJSON-60207750 tempest-AttachVolumeShelveTestJSON-60207750-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1064.517680] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-285a127f-efeb-407b-88b1-8d8daa088884 tempest-AttachVolumeShelveTestJSON-60207750 tempest-AttachVolumeShelveTestJSON-60207750-project-member] 
Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1064.520143] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-285a127f-efeb-407b-88b1-8d8daa088884 tempest-AttachVolumeShelveTestJSON-60207750 tempest-AttachVolumeShelveTestJSON-60207750-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 1064.520143] nova-conductor[52554]: Traceback (most recent call last): [ 1064.520143] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1064.520143] nova-conductor[52554]: return func(*args, **kwargs) [ 1064.520143] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1064.520143] nova-conductor[52554]: selections = self._select_destinations( [ 1064.520143] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1064.520143] nova-conductor[52554]: selections = self._schedule( [ 1064.520143] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1064.520143] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 1064.520143] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1064.520143] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 1064.520143] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1064.520143] nova-conductor[52554]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1064.521491] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-285a127f-efeb-407b-88b1-8d8daa088884 tempest-AttachVolumeShelveTestJSON-60207750 tempest-AttachVolumeShelveTestJSON-60207750-project-member] [instance: 3ddae09d-0924-4332-a7ea-c81a56bc6d95] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1067.242864] nova-conductor[52553]: ERROR nova.conductor.manager [None req-628a086a-7dd0-4e0a-8b6e-2958e0218ab3 tempest-AttachVolumeShelveTestJSON-60207750 tempest-AttachVolumeShelveTestJSON-60207750-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 1067.242864] nova-conductor[52553]: Traceback (most recent call last): [ 1067.242864] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1067.242864] nova-conductor[52553]: return func(*args, **kwargs) [ 1067.242864] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1067.242864] nova-conductor[52553]: selections = self._select_destinations( [ 1067.242864] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1067.242864] nova-conductor[52553]: selections = self._schedule( [ 1067.242864] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1067.242864] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 1067.242864] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1067.242864] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 1067.242864] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1067.242864] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1067.242864] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 1067.242864] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 1067.242864] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 1067.242864] nova-conductor[52553]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 1067.242864] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 1067.242864] nova-conductor[52553]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 1067.242864] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 1067.242864] nova-conductor[52553]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 1067.242864] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1067.242864] nova-conductor[52553]: ERROR nova.conductor.manager result = self.transport._send( [ 1067.242864] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1067.242864] nova-conductor[52553]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 1067.242864] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1067.242864] nova-conductor[52553]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1067.242864] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1067.242864] nova-conductor[52553]: ERROR nova.conductor.manager raise result [ 1067.242864] nova-conductor[52553]: ERROR 
nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1067.242864] nova-conductor[52553]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1067.242864] nova-conductor[52553]: ERROR nova.conductor.manager [ 1067.242864] nova-conductor[52553]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1067.242864] nova-conductor[52553]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 1067.242864] nova-conductor[52553]: ERROR nova.conductor.manager [ 1067.242864] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1067.242864] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._select_destinations( [ 1067.242864] nova-conductor[52553]: ERROR nova.conductor.manager [ 1067.242864] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1067.242864] nova-conductor[52553]: ERROR nova.conductor.manager selections = self._schedule( [ 1067.242864] nova-conductor[52553]: ERROR nova.conductor.manager [ 1067.242864] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1067.242864] nova-conductor[52553]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 1067.242864] nova-conductor[52553]: ERROR nova.conductor.manager [ 1067.242864] nova-conductor[52553]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1067.242864] nova-conductor[52553]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 1067.242864] nova-conductor[52553]: ERROR nova.conductor.manager [ 1067.242864] nova-conductor[52553]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 1067.242864] nova-conductor[52553]: ERROR nova.conductor.manager [ 1067.242864] nova-conductor[52553]: ERROR nova.conductor.manager [ 1067.258027] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-628a086a-7dd0-4e0a-8b6e-2958e0218ab3 tempest-AttachVolumeShelveTestJSON-60207750 tempest-AttachVolumeShelveTestJSON-60207750-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1067.258027] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-628a086a-7dd0-4e0a-8b6e-2958e0218ab3 tempest-AttachVolumeShelveTestJSON-60207750 tempest-AttachVolumeShelveTestJSON-60207750-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1067.258027] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-628a086a-7dd0-4e0a-8b6e-2958e0218ab3 tempest-AttachVolumeShelveTestJSON-60207750 tempest-AttachVolumeShelveTestJSON-60207750-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1067.302182] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-628a086a-7dd0-4e0a-8b6e-2958e0218ab3 tempest-AttachVolumeShelveTestJSON-60207750 tempest-AttachVolumeShelveTestJSON-60207750-project-member] [instance: 534e93b0-9048-4397-a5d5-5664f2ebf05d] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 1067.302871] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-628a086a-7dd0-4e0a-8b6e-2958e0218ab3 tempest-AttachVolumeShelveTestJSON-60207750 tempest-AttachVolumeShelveTestJSON-60207750-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1067.303103] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-628a086a-7dd0-4e0a-8b6e-2958e0218ab3 tempest-AttachVolumeShelveTestJSON-60207750 tempest-AttachVolumeShelveTestJSON-60207750-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1067.303281] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-628a086a-7dd0-4e0a-8b6e-2958e0218ab3 tempest-AttachVolumeShelveTestJSON-60207750 tempest-AttachVolumeShelveTestJSON-60207750-project-member] 
Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1067.306169] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-628a086a-7dd0-4e0a-8b6e-2958e0218ab3 tempest-AttachVolumeShelveTestJSON-60207750 tempest-AttachVolumeShelveTestJSON-60207750-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 1067.306169] nova-conductor[52553]: Traceback (most recent call last): [ 1067.306169] nova-conductor[52553]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1067.306169] nova-conductor[52553]: return func(*args, **kwargs) [ 1067.306169] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1067.306169] nova-conductor[52553]: selections = self._select_destinations( [ 1067.306169] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1067.306169] nova-conductor[52553]: selections = self._schedule( [ 1067.306169] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1067.306169] nova-conductor[52553]: self._ensure_sufficient_hosts( [ 1067.306169] nova-conductor[52553]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1067.306169] nova-conductor[52553]: raise exception.NoValidHost(reason=reason) [ 1067.306169] nova-conductor[52553]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1067.306169] nova-conductor[52553]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1067.306733] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-628a086a-7dd0-4e0a-8b6e-2958e0218ab3 tempest-AttachVolumeShelveTestJSON-60207750 tempest-AttachVolumeShelveTestJSON-60207750-project-member] [instance: 534e93b0-9048-4397-a5d5-5664f2ebf05d] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1068.067901] nova-conductor[52554]: ERROR nova.conductor.manager [None req-b4a2cf26-b4a6-45b1-9021-e363f20a3105 tempest-ServersV294TestFqdnHostnames-875455869 tempest-ServersV294TestFqdnHostnames-875455869-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 1068.067901] nova-conductor[52554]: Traceback (most recent call last): [ 1068.067901] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1068.067901] nova-conductor[52554]: return func(*args, **kwargs) [ 1068.067901] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1068.067901] nova-conductor[52554]: selections = self._select_destinations( [ 1068.067901] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1068.067901] nova-conductor[52554]: selections = self._schedule( [ 1068.067901] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1068.067901] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 1068.067901] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1068.067901] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 1068.067901] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1068.067901] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1068.067901] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 1068.067901] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 1068.067901] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 1068.067901] nova-conductor[52554]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 1068.067901] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 1068.067901] nova-conductor[52554]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 1068.067901] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 1068.067901] nova-conductor[52554]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 1068.067901] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1068.067901] nova-conductor[52554]: ERROR nova.conductor.manager result = self.transport._send( [ 1068.067901] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1068.067901] nova-conductor[52554]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 1068.067901] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1068.067901] nova-conductor[52554]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1068.067901] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1068.067901] nova-conductor[52554]: ERROR nova.conductor.manager raise result [ 1068.067901] nova-conductor[52554]: ERROR 
nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1068.067901] nova-conductor[52554]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1068.067901] nova-conductor[52554]: ERROR nova.conductor.manager [ 1068.067901] nova-conductor[52554]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1068.067901] nova-conductor[52554]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 1068.067901] nova-conductor[52554]: ERROR nova.conductor.manager [ 1068.067901] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1068.067901] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._select_destinations( [ 1068.067901] nova-conductor[52554]: ERROR nova.conductor.manager [ 1068.067901] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1068.067901] nova-conductor[52554]: ERROR nova.conductor.manager selections = self._schedule( [ 1068.067901] nova-conductor[52554]: ERROR nova.conductor.manager [ 1068.067901] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1068.067901] nova-conductor[52554]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 1068.067901] nova-conductor[52554]: ERROR nova.conductor.manager [ 1068.067901] nova-conductor[52554]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1068.067901] nova-conductor[52554]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 1068.067901] nova-conductor[52554]: ERROR nova.conductor.manager [ 1068.067901] nova-conductor[52554]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 1068.067901] nova-conductor[52554]: ERROR nova.conductor.manager [ 1068.067901] nova-conductor[52554]: ERROR nova.conductor.manager [ 1068.074647] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b4a2cf26-b4a6-45b1-9021-e363f20a3105 tempest-ServersV294TestFqdnHostnames-875455869 tempest-ServersV294TestFqdnHostnames-875455869-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1068.074873] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b4a2cf26-b4a6-45b1-9021-e363f20a3105 tempest-ServersV294TestFqdnHostnames-875455869 tempest-ServersV294TestFqdnHostnames-875455869-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1068.075060] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b4a2cf26-b4a6-45b1-9021-e363f20a3105 tempest-ServersV294TestFqdnHostnames-875455869 tempest-ServersV294TestFqdnHostnames-875455869-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1068.113706] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-b4a2cf26-b4a6-45b1-9021-e363f20a3105 tempest-ServersV294TestFqdnHostnames-875455869 tempest-ServersV294TestFqdnHostnames-875455869-project-member] [instance: 52aae65b-6003-4692-a38a-63130b2a1151] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 1068.114418] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b4a2cf26-b4a6-45b1-9021-e363f20a3105 tempest-ServersV294TestFqdnHostnames-875455869 tempest-ServersV294TestFqdnHostnames-875455869-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1068.114633] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b4a2cf26-b4a6-45b1-9021-e363f20a3105 tempest-ServersV294TestFqdnHostnames-875455869 tempest-ServersV294TestFqdnHostnames-875455869-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1068.114837] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b4a2cf26-b4a6-45b1-9021-e363f20a3105 tempest-ServersV294TestFqdnHostnames-875455869 
tempest-ServersV294TestFqdnHostnames-875455869-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1068.117658] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-b4a2cf26-b4a6-45b1-9021-e363f20a3105 tempest-ServersV294TestFqdnHostnames-875455869 tempest-ServersV294TestFqdnHostnames-875455869-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 1068.117658] nova-conductor[52554]: Traceback (most recent call last): [ 1068.117658] nova-conductor[52554]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1068.117658] nova-conductor[52554]: return func(*args, **kwargs) [ 1068.117658] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1068.117658] nova-conductor[52554]: selections = self._select_destinations( [ 1068.117658] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1068.117658] nova-conductor[52554]: selections = self._schedule( [ 1068.117658] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1068.117658] nova-conductor[52554]: self._ensure_sufficient_hosts( [ 1068.117658] nova-conductor[52554]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1068.117658] nova-conductor[52554]: raise exception.NoValidHost(reason=reason) [ 1068.117658] nova-conductor[52554]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1068.117658] nova-conductor[52554]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1068.118431] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-b4a2cf26-b4a6-45b1-9021-e363f20a3105 tempest-ServersV294TestFqdnHostnames-875455869 tempest-ServersV294TestFqdnHostnames-875455869-project-member] [instance: 52aae65b-6003-4692-a38a-63130b2a1151] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1074.478119] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Took 0.12 seconds to select destinations for 1 instance(s). 
{{(pid=52553) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 1074.487875] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1074.488220] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1074.488366] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1074.515197] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1074.515430] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1074.515586] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1074.516010] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1074.516220] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-e3b90cce-9493-4211-a984-92014de3d208 
tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1074.516385] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1074.524607] nova-conductor[52553]: DEBUG nova.quota [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Getting quotas for project 583d131351dd4ef6a6db6ffd061a6a1e. Resources: {'cores', 'instances', 'ram'} {{(pid=52553) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 1074.526851] nova-conductor[52553]: DEBUG nova.quota [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Getting quotas for user d9589808fcce4507bd6988b2a5119ff9 and project 583d131351dd4ef6a6db6ffd061a6a1e. Resources: {'cores', 'instances', 'ram'} {{(pid=52553) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 1074.532205] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52553) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 1074.532739] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1074.532944] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1074.533129] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 
1074.535614] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52553) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 1074.536243] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1074.536442] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1074.536610] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1074.552539] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1074.552741] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1074.552911] nova-conductor[52553]: DEBUG oslo_concurrency.lockutils [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Lock 
"e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52553) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1105.698170] nova-conductor[52554]: DEBUG nova.db.main.api [None req-0ecb6f04-3d6b-4fb2-8c9c-5c2bb133595f tempest-DeleteServersAdminTestJSON-409634133 tempest-DeleteServersAdminTestJSON-409634133-project-member] Created instance_extra for e84f3fe9-d377-4018-8874-972d1f888208 {{(pid=52554) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 1152.392316] nova-conductor[52553]: DEBUG nova.db.main.api [None req-838c9fd1-fd9b-4dda-9994-246915d158d5 tempest-MigrationsAdminTest-2016093575 tempest-MigrationsAdminTest-2016093575-project-member] Created instance_extra for 0d87148b-1493-4777-a8b3-b94a64e8eca6 {{(pid=52553) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 1153.405692] nova-conductor[52553]: DEBUG nova.db.main.api [None req-ef99a399-6c92-4b84-a789-c45fb6281616 tempest-AttachVolumeTestJSON-281399721 tempest-AttachVolumeTestJSON-281399721-project-member] Created instance_extra for f3566a4b-8fe0-4c85-9c45-7c67cfd30323 {{(pid=52553) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 1153.780381] nova-conductor[52553]: DEBUG nova.db.main.api [None req-c7fbca0e-03d3-4e50-a6be-f7c68dcb4b8d tempest-AttachInterfacesTestJSON-1338008180 tempest-AttachInterfacesTestJSON-1338008180-project-member] Created instance_extra for c0f7ff03-5203-418d-aa9e-420448e9dbfb {{(pid=52553) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 1201.216080] nova-conductor[52554]: ERROR nova.scheduler.utils [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File 
"/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance 4754c01f-d312-4b2a-af5a-a34c5bcb42eb was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 1201.216429] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Rescheduling: True {{(pid=52554) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 1201.216604] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 4754c01f-d312-4b2a-af5a-a34c5bcb42eb.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 4754c01f-d312-4b2a-af5a-a34c5bcb42eb. [ 1201.216818] nova-conductor[52554]: WARNING nova.scheduler.utils [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 4754c01f-d312-4b2a-af5a-a34c5bcb42eb. [ 1201.235492] nova-conductor[52554]: DEBUG nova.network.neutron [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] deallocate_for_instance() {{(pid=52554) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1201.254214] nova-conductor[52554]: DEBUG nova.network.neutron [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Instance cache missing network info. 
{{(pid=52554) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1201.257202] nova-conductor[52554]: DEBUG nova.network.neutron [None req-e3b90cce-9493-4211-a984-92014de3d208 tempest-ServerAddressesNegativeTestJSON-2061325312 tempest-ServerAddressesNegativeTestJSON-2061325312-project-member] [instance: 4754c01f-d312-4b2a-af5a-a34c5bcb42eb] Updating instance_info_cache with network_info: [] {{(pid=52554) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1206.680371] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Took 0.18 seconds to select destinations for 1 instance(s). {{(pid=52554) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 1206.691531] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1206.691798] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1206.691974] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1206.719670] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1206.719936] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1206.720138] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1206.720525] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1206.720716] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1206.720880] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1206.728656] nova-conductor[52554]: DEBUG nova.quota [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Getting quotas for project c73de74d24b547d686045b7848f07007. Resources: {'cores', 'instances', 'ram'} {{(pid=52554) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 1206.730946] nova-conductor[52554]: DEBUG nova.quota [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Getting quotas for user 549edeef8bbc465c941651c9e0523f27 and project c73de74d24b547d686045b7848f07007. 
Resources: {'cores', 'instances', 'ram'} {{(pid=52554) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 1206.736254] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52554) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 1206.736695] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1206.736894] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1206.737074] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1206.739587] nova-conductor[52554]: DEBUG nova.conductor.manager [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='856e89ba-b7a4-4a81-ad9d-2997fe327c0c',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52554) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 1206.740226] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1206.740427] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 
tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1206.740593] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1206.754419] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Acquiring lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1206.754621] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1206.754793] nova-conductor[52554]: DEBUG oslo_concurrency.lockutils [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Lock "e9ed2ce3-0d39-42ad-aaef-fb9d005eeabe" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52554) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1257.897267] nova-conductor[52553]: ERROR nova.scheduler.utils [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File 
"/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance 2d7dbbc6-07b5-4f4c-8098-d190fabc545b was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 1257.897964] nova-conductor[52553]: DEBUG nova.conductor.manager [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Rescheduling: True {{(pid=52553) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 1257.898241] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 2d7dbbc6-07b5-4f4c-8098-d190fabc545b.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 2d7dbbc6-07b5-4f4c-8098-d190fabc545b. [ 1257.898550] nova-conductor[52553]: WARNING nova.scheduler.utils [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 2d7dbbc6-07b5-4f4c-8098-d190fabc545b. [ 1257.919947] nova-conductor[52553]: DEBUG nova.network.neutron [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] deallocate_for_instance() {{(pid=52553) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1257.937318] nova-conductor[52553]: DEBUG nova.network.neutron [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Instance cache missing network info. 
{{(pid=52553) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1257.940739] nova-conductor[52553]: DEBUG nova.network.neutron [None req-b1b1b82c-5b1e-4d27-b73d-ae9ccb92cc5c tempest-ServerRescueTestJSONUnderV235-24253888 tempest-ServerRescueTestJSONUnderV235-24253888-project-member] [instance: 2d7dbbc6-07b5-4f4c-8098-d190fabc545b] Updating instance_info_cache with network_info: [] {{(pid=52553) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
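Two failure patterns recur in the conductor entries above: the scheduler raising NoValidHost when no compute host survives filtering, and builds on the single available host (cpu-1, VMware backend) failing during the image copy with "A specified parameter was not correct: fileType", which surfaces as RescheduledException and finally MaxRetriesExceeded with the instance set to ERROR. As a minimal sketch of the first pattern only, assuming a Nova checkout is importable, the check the tracebacks point at (nova/scheduler/manager.py, _ensure_sufficient_hosts) reduces to roughly the following; the function and argument names here are illustrative and are not copied from the source:

    # Illustrative sketch, not the actual Nova implementation.
    from nova import exception

    def ensure_sufficient_hosts(hosts, required_count):
        # When fewer hosts passed filtering than instances were requested,
        # scheduling fails with the NoValidHost error seen in the log above.
        if len(hosts) < required_count:
            raise exception.NoValidHost(
                reason='There are not enough hosts available.')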