[ 418.253686] nova-conductor[51914]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code. [ 419.469915] nova-conductor[51914]: DEBUG oslo_db.sqlalchemy.engines [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=51914) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 419.496257] nova-conductor[51914]: DEBUG nova.context [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),e3885d83-7df6-4250-a13b-6a1c0495dd3b(cell1) {{(pid=51914) load_cells /opt/stack/nova/nova/context.py:464}} [ 419.498109] nova-conductor[51914]: DEBUG oslo_concurrency.lockutils [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=51914) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 419.498326] nova-conductor[51914]: DEBUG oslo_concurrency.lockutils [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=51914) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 419.498825] nova-conductor[51914]: DEBUG oslo_concurrency.lockutils [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=51914) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 419.499186] nova-conductor[51914]: DEBUG oslo_concurrency.lockutils [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=51914) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 419.499367] nova-conductor[51914]: DEBUG oslo_concurrency.lockutils [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=51914) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 419.500287] nova-conductor[51914]: DEBUG oslo_concurrency.lockutils [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=51914) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 419.505542] nova-conductor[51914]: DEBUG oslo_db.sqlalchemy.engines [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=51914) _check_effective_sql_mode 
/usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 419.505916] nova-conductor[51914]: DEBUG oslo_db.sqlalchemy.engines [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=51914) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 419.567491] nova-conductor[51914]: DEBUG oslo_concurrency.lockutils [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] Acquiring lock "singleton_lock" {{(pid=51914) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 419.567698] nova-conductor[51914]: DEBUG oslo_concurrency.lockutils [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] Acquired lock "singleton_lock" {{(pid=51914) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 419.567893] nova-conductor[51914]: DEBUG oslo_concurrency.lockutils [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] Releasing lock "singleton_lock" {{(pid=51914) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 419.568340] nova-conductor[51914]: INFO oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] Starting 2 workers [ 419.572930] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] Started child 52625 {{(pid=51914) _start_child /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:575}} [ 419.576253] nova-conductor[52625]: INFO nova.service [-] Starting conductor node (version 0.0.1) [ 419.577389] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] Started child 52626 {{(pid=51914) _start_child /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:575}} [ 419.578033] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] Full set of CONF: {{(pid=51914) wait /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:649}} [ 419.578271] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] ******************************************************************************** {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2589}} [ 419.578434] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] Configuration options gathered from: {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2590}} [ 419.578608] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] command line args: ['--config-file', '/etc/nova/nova.conf'] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2591}} [ 419.578946] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] config files: ['/etc/nova/nova.conf'] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2592}} [ 419.579098] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] ================================================================================ {{(pid=51914) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2594}} [ 419.579509] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] allow_resize_to_same_host = True {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.579879] nova-conductor[52626]: INFO nova.service [-] Starting conductor node (version 0.0.1) [ 419.580099] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] arq_binding_timeout = 300 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.580317] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] block_device_allocate_retries = 60 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.580534] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] block_device_allocate_retries_interval = 3 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.580750] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cert = self.pem {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.580936] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] compute_driver = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.581182] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] compute_monitors = [] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.581442] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] config_dir = [] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.581676] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] config_drive_format = iso9660 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.581834] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] config_file = ['/etc/nova/nova.conf'] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.582043] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] config_source = [] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.582250] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] console_host = devstack {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.582451] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] control_exchange = nova {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.582630] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cpu_allocation_ratio = None {{(pid=51914) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.582804] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] daemon = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.583013] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] debug = True {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.583197] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] default_access_ip_network_name = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.583437] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] default_availability_zone = nova {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.583633] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] default_ephemeral_format = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.583927] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.584144] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] default_schedule_zone = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.584369] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] disk_allocation_ratio = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.584471] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] enable_new_services = True {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.584700] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] enabled_apis = ['osapi_compute'] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.584882] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] enabled_ssl_apis = [] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.585142] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] flat_injected = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.585341] 
nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] force_config_drive = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.585533] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] force_raw_images = True {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.585730] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] graceful_shutdown_timeout = 5 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.585908] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] heal_instance_info_cache_interval = 60 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.586370] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] host = devstack {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.586604] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] initial_cpu_allocation_ratio = 4.0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.586941] nova-conductor[52625]: DEBUG oslo_db.sqlalchemy.engines [None req-618b9da9-f642-43e5-93d7-100dcc3d93d0 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=52625) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 419.587125] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] initial_disk_allocation_ratio = 1.0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.587318] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] initial_ram_allocation_ratio = 1.0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.587577] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.587759] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] instance_build_timeout = 0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.587917] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] instance_delete_interval = 300 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.588093] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] instance_format = [instance: %(uuid)s] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.588258] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] 
instance_name_template = instance-%08x {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.588416] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] instance_usage_audit = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.588615] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] instance_usage_audit_period = month {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.588811] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.589015] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] instances_path = /opt/stack/data/nova/instances {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.589194] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] internal_service_availability_zone = internal {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.589366] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] key = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.589546] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] live_migration_retry_count = 30 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.589745] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] log_config_append = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.589923] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.590114] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] log_dir = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.590293] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] log_file = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.590449] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] log_options = True {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.590630] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] log_rotate_interval = 1 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.590833] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] log_rotate_interval_type = days {{(pid=51914) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.591021] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] log_rotation_type = none {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.591218] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.591362] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.591539] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.591722] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.591847] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.592056] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] long_rpc_timeout = 1800 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.592289] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] max_concurrent_builds = 10 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.592460] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] max_concurrent_live_migrations = 1 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.592647] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] max_concurrent_snapshots = 5 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.592809] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] max_local_block_devices = 3 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.593144] nova-conductor[52626]: DEBUG oslo_db.sqlalchemy.engines [None req-32ad772a-db9e-4202-b44a-30111a6e9be3 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=52626) _check_effective_sql_mode 
/usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 419.593289] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] max_logfile_count = 30 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.593493] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] max_logfile_size_mb = 200 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.593651] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] maximum_instance_delete_attempts = 5 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.593867] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] metadata_listen = 0.0.0.0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.594126] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] metadata_listen_port = 8775 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.594321] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] metadata_workers = 2 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.594488] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] migrate_max_retries = -1 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.594649] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] mkisofs_cmd = genisoimage {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.594856] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] my_block_storage_ip = 10.180.1.21 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.594984] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] my_ip = 10.180.1.21 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.595158] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] network_allocate_retries = 0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.595331] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.595520] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] osapi_compute_listen = 0.0.0.0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.595730] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] osapi_compute_listen_port = 8774 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.595907] 
nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] osapi_compute_unique_server_name_scope = {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.596084] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] osapi_compute_workers = 2 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.596364] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] password_length = 12 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.596562] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] periodic_enable = True {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.596714] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] periodic_fuzzy_delay = 60 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.596876] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] pointer_model = usbtablet {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.597067] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] preallocate_images = none {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.597274] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] publish_errors = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.597419] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] pybasedir = /opt/stack/nova {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.597597] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] ram_allocation_ratio = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.597757] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] rate_limit_burst = 0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.597917] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] rate_limit_except_level = CRITICAL {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.598080] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] rate_limit_interval = 0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.598237] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] reboot_timeout = 0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.598393] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] reclaim_instance_interval = 0 
{{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.598598] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] record = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.598786] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] reimage_timeout_per_gb = 20 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.598987] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] report_interval = 10 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.599184] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] rescue_timeout = 0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.599347] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] reserved_host_cpus = 0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.599526] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] reserved_host_disk_mb = 0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.599692] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] reserved_host_memory_mb = 512 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.599859] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] reserved_huge_pages = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.600047] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] resize_confirm_window = 0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.600228] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] resize_fs_using_block_device = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.600387] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] resume_guests_state_on_host_boot = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.600559] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.600807] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] rpc_response_timeout = 60 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.601015] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] run_external_periodic_tasks = True {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.601201] 
nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] running_deleted_instance_action = reap {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.601375] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] running_deleted_instance_poll_interval = 1800 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.601578] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] running_deleted_instance_timeout = 0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.601736] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] scheduler_instance_sync_interval = 120 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.601907] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] service_down_time = 60 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.602112] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] servicegroup_driver = db {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.602279] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] shelved_offload_time = 0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.602478] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] shelved_poll_interval = 3600 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.602702] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] shutdown_timeout = 0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.602872] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] source_is_ipv6 = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.603041] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] ssl_only = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.603210] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] state_path = /opt/stack/data/nova {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.603389] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] sync_power_state_interval = 600 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.603547] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] sync_power_state_pool_size = 1000 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.603710] nova-conductor[51914]: DEBUG oslo_service.service [None 
req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] syslog_log_facility = LOG_USER {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.603860] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] tempdir = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.604017] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] timeout_nbd = 10 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.604227] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] transport_url = **** {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.604392] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] update_resources_interval = 0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.604550] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] use_cow_images = True {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.604730] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] use_eventlog = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.604915] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] use_journal = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.605084] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] use_json = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.605239] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] use_rootwrap_daemon = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.605437] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] use_stderr = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.605557] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] use_syslog = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.605733] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vcpu_pin_set = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.605920] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vif_plugging_is_fatal = True {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.606112] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vif_plugging_timeout = 300 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.606338] 
nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] virt_mkfs = [] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.606510] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] volume_usage_poll_interval = 0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.606665] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] watch_log_file = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.606861] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] web = /usr/share/spice-html5 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 419.607153] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_concurrency.disable_process_locking = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.607434] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_concurrency.lock_path = /opt/stack/data/nova {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.607676] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.607851] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.608030] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_messaging_metrics.metrics_process_name = {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.608202] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.608363] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.608590] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api.auth_strategy = keystone {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.608779] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api.compute_link_prefix = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.608970] nova-conductor[52625]: DEBUG nova.service [None req-618b9da9-f642-43e5-93d7-100dcc3d93d0 None None] Creating RPC server for service conductor {{(pid=52625) start /opt/stack/nova/nova/service.py:182}} 
[ 419.609175] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.609384] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api.dhcp_domain = novalocal {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.609561] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api.enable_instance_password = True {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.609724] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api.glance_link_prefix = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.609884] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.610094] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api.instance_list_cells_batch_strategy = distributed {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.610262] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api.instance_list_per_project_cells = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.610447] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api.list_records_by_skipping_down_cells = True {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.610624] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api.local_metadata_per_cell = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.610823] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api.max_limit = 1000 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.611021] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api.metadata_cache_expiration = 15 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.611192] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api.neutron_default_tenant_id = default {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.611356] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api.use_forwarded_for = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.611519] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api.use_neutron_default_nets = False {{(pid=51914) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.611688] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.611846] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api.vendordata_dynamic_failure_fatal = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.612010] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.612190] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api.vendordata_dynamic_ssl_certfile = {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.612364] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api.vendordata_dynamic_targets = [] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.612543] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api.vendordata_jsonfile_path = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.612729] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api.vendordata_providers = ['StaticJSON'] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.612972] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cache.backend = dogpile.cache.memcached {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.613208] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cache.backend_argument = **** {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.613418] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cache.config_prefix = cache.oslo {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.613618] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cache.dead_timeout = 60.0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.613776] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cache.debug_cache_backend = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.613935] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cache.enable_retry_client = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.614137] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cache.enable_socket_keepalive = False {{(pid=51914) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.614348] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cache.enabled = True {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.614533] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cache.expiration_time = 600 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.614695] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cache.hashclient_retry_attempts = 2 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.614853] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cache.hashclient_retry_delay = 1.0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.615025] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cache.memcache_dead_retry = 300 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.615214] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cache.memcache_password = {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.615428] nova-conductor[52626]: DEBUG nova.service [None req-32ad772a-db9e-4202-b44a-30111a6e9be3 None None] Creating RPC server for service conductor {{(pid=52626) start /opt/stack/nova/nova/service.py:182}} [ 419.615582] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.615814] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.616035] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cache.memcache_pool_maxsize = 10 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.616219] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cache.memcache_pool_unused_timeout = 60 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.616405] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cache.memcache_sasl_enabled = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.616588] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cache.memcache_servers = ['localhost:11211'] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.616779] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cache.memcache_socket_timeout = 1.0 {{(pid=51914) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.616923] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cache.memcache_username = {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.617105] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cache.proxies = [] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.617273] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cache.retry_attempts = 2 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.617438] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cache.retry_delay = 0.0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.617624] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cache.socket_keepalive_count = 1 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.617800] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cache.socket_keepalive_idle = 1 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.617959] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cache.socket_keepalive_interval = 1 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.618129] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cache.tls_allowed_ciphers = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.618307] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cache.tls_cafile = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.618467] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cache.tls_certfile = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.618621] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cache.tls_enabled = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.618770] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cache.tls_keyfile = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.618976] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cinder.auth_section = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.619193] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cinder.auth_type = password {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.619399] nova-conductor[51914]: DEBUG oslo_service.service 
[None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cinder.cafile = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.619603] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cinder.catalog_info = volumev3::publicURL {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.619771] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cinder.certfile = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.619948] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cinder.collect_timing = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.620139] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cinder.cross_az_attach = True {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.620303] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cinder.debug = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.620457] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cinder.endpoint_template = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.620633] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cinder.http_retries = 3 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.620794] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cinder.insecure = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.620952] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cinder.keyfile = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.621216] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cinder.os_region_name = RegionOne {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.621788] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cinder.split_loggers = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.621788] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cinder.timeout = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.621788] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.621932] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] compute.cpu_dedicated_set = None {{(pid=51914) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.622029] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] compute.cpu_shared_set = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.622341] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] compute.image_type_exclude_list = [] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.622341] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] compute.live_migration_wait_for_vif_plug = True {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.622836] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] compute.max_concurrent_disk_ops = 0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.622836] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] compute.max_disk_devices_to_attach = -1 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.622836] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.623018] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.623185] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] compute.resource_provider_association_refresh = 300 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.623355] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] compute.shutdown_retry_interval = 10 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.623532] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.623701] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] conductor.workers = 2 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.623877] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] console.allowed_origins = [] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.624045] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] console.ssl_ciphers = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.624230] nova-conductor[51914]: DEBUG oslo_service.service [None 
req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] console.ssl_minimum_version = default {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.624444] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] consoleauth.token_ttl = 600 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.624650] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cyborg.cafile = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.624812] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cyborg.certfile = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.625375] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cyborg.collect_timing = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.625375] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cyborg.connect_retries = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.625375] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cyborg.connect_retry_delay = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.625507] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cyborg.endpoint_override = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.625587] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cyborg.insecure = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.625743] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cyborg.keyfile = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.625914] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cyborg.max_version = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.626089] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cyborg.min_version = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.626313] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cyborg.region_name = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.626609] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cyborg.service_name = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.626687] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cyborg.service_type = accelerator {{(pid=51914) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.626834] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cyborg.split_loggers = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.626984] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cyborg.status_code_retries = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.627150] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cyborg.status_code_retry_delay = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.627336] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cyborg.timeout = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.627523] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.627692] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] cyborg.version = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.628305] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] database.backend = sqlalchemy {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.628305] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] database.connection = **** {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.628305] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] database.connection_debug = 0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.628426] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] database.connection_parameters = {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.628579] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] database.connection_recycle_time = 3600 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.628736] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] database.connection_trace = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.628940] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] database.db_inc_retry_interval = True {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.629157] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] database.db_max_retries = 20 {{(pid=51914) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.629395] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] database.db_max_retry_interval = 10 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.629445] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] database.db_retry_interval = 1 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.629834] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] database.max_overflow = 50 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.629834] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] database.max_pool_size = 5 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.629972] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] database.max_retries = 10 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.630116] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] database.mysql_enable_ndb = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.630462] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] database.mysql_sql_mode = TRADITIONAL {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.630462] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] database.mysql_wsrep_sync_wait = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.630628] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] database.pool_timeout = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.630792] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] database.retry_interval = 10 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.630978] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] database.slave_connection = **** {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.631114] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] database.sqlite_synchronous = True {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.631287] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] database.use_db_reconnect = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.631494] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api_database.backend = sqlalchemy {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} 
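The `group.option = value` entries in this dump are produced by oslo.config's `ConfigOpts.log_opt_values()` (the `log_opt_values .../oslo_config/cfg.py:2609` reference on each line). Below is a minimal, hypothetical sketch of that mechanism, not nova's actual option definitions; the group names, option names, and defaults are copied from the log above purely for illustration, and `secret=True` is what causes values such as `database.connection` to appear as `****`.

```python
# Minimal sketch of how a config dump like the one above is produced with
# oslo.config: options are registered per group, the config file is parsed,
# and ConfigOpts.log_opt_values() logs every "group.option = value" line at
# DEBUG level. Option names/defaults are taken from the log for illustration.
import logging

from oslo_config import cfg

logging.basicConfig(level=logging.DEBUG)
LOG = logging.getLogger(__name__)

CONF = cfg.ConfigOpts()

cache_opts = [
    cfg.BoolOpt('enabled', default=True),
    cfg.IntOpt('expiration_time', default=600),
    cfg.ListOpt('memcache_servers', default=['localhost:11211']),
]
database_opts = [
    cfg.IntOpt('max_pool_size', default=5),
    cfg.IntOpt('max_overflow', default=50),
    cfg.StrOpt('mysql_sql_mode', default='TRADITIONAL'),
    # secret=True makes log_opt_values() mask the value as ****
    cfg.StrOpt('connection', secret=True),
]

CONF.register_opts(cache_opts, group='cache')
CONF.register_opts(database_opts, group='database')

# Parse command line / config files (e.g. --config-file /etc/nova/nova.conf),
# then dump the effective values, the same kind of output seen in this log.
CONF(args=[], project='demo')
CONF.log_opt_values(LOG, logging.DEBUG)
```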
[ 419.631757] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api_database.connection = **** {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.631825] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api_database.connection_debug = 0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.631978] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api_database.connection_parameters = {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.632553] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api_database.connection_recycle_time = 3600 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.632553] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api_database.connection_trace = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.632553] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api_database.db_inc_retry_interval = True {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.632652] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api_database.db_max_retries = 20 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.632766] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api_database.db_max_retry_interval = 10 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.632942] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api_database.db_retry_interval = 1 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.633139] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api_database.max_overflow = 50 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.633359] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api_database.max_pool_size = 5 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.633477] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api_database.max_retries = 10 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.634172] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api_database.mysql_enable_ndb = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.634172] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 
419.634172] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api_database.mysql_wsrep_sync_wait = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.634172] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api_database.pool_timeout = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.634331] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api_database.retry_interval = 10 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.634419] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api_database.slave_connection = **** {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.634601] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] api_database.sqlite_synchronous = True {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.634835] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] devices.enabled_mdev_types = [] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.635032] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.635213] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] ephemeral_storage_encryption.enabled = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.635386] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] ephemeral_storage_encryption.key_size = 512 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.635634] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] glance.api_servers = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.635735] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] glance.cafile = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.635908] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] glance.certfile = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.636072] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] glance.collect_timing = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.636376] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] glance.connect_retries = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.636438] 
nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] glance.connect_retry_delay = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.636611] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] glance.debug = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.636927] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] glance.default_trusted_certificate_ids = [] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.637016] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] glance.enable_certificate_validation = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.637135] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] glance.enable_rbd_download = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.637483] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] glance.endpoint_override = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.637635] nova-conductor[52625]: DEBUG nova.service [None req-618b9da9-f642-43e5-93d7-100dcc3d93d0 None None] Join ServiceGroup membership for this service conductor {{(pid=52625) start /opt/stack/nova/nova/service.py:199}} [ 419.637785] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] glance.insecure = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.638065] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] glance.keyfile = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.638191] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] glance.max_version = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.638378] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] glance.min_version = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.638543] nova-conductor[52625]: DEBUG nova.servicegroup.drivers.db [None req-618b9da9-f642-43e5-93d7-100dcc3d93d0 None None] DB_Driver: join new ServiceGroup member devstack to the conductor group, service = {{(pid=52625) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 419.638735] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] glance.num_retries = 3 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.638939] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] glance.rbd_ceph_conf = {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.639097] nova-conductor[51914]: DEBUG oslo_service.service [None 
req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] glance.rbd_connect_timeout = 5 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.639266] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] glance.rbd_pool = {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.639432] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] glance.rbd_user = {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.639580] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] glance.region_name = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.639801] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] glance.service_name = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.640032] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] glance.service_type = image {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.640181] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] glance.split_loggers = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.640498] nova-conductor[52626]: DEBUG nova.service [None req-32ad772a-db9e-4202-b44a-30111a6e9be3 None None] Join ServiceGroup membership for this service conductor {{(pid=52626) start /opt/stack/nova/nova/service.py:199}} [ 419.640640] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] glance.status_code_retries = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.640817] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] glance.status_code_retry_delay = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.641017] nova-conductor[52626]: DEBUG nova.servicegroup.drivers.db [None req-32ad772a-db9e-4202-b44a-30111a6e9be3 None None] DB_Driver: join new ServiceGroup member devstack to the conductor group, service = {{(pid=52626) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 419.641149] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] glance.timeout = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.641349] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.641549] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] glance.verify_glance_signatures = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.641723] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] glance.version = None 
{{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.641889] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] guestfs.debug = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.642101] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] hyperv.config_drive_cdrom = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.642267] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] hyperv.config_drive_inject_password = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.642438] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.642595] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] hyperv.enable_instance_metrics_collection = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.642754] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] hyperv.enable_remotefx = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.642919] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] hyperv.instances_path_share = {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.643095] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] hyperv.iscsi_initiator_list = [] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.643289] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] hyperv.limit_cpu_features = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.643458] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.643621] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.643779] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] hyperv.power_state_check_timeframe = 60 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.643934] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] hyperv.power_state_event_polling_interval = 2 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.644128] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] hyperv.qemu_img_cmd = 
qemu-img.exe {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.644291] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] hyperv.use_multipath_io = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.644481] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] hyperv.volume_attach_retry_count = 10 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.644642] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] hyperv.volume_attach_retry_interval = 5 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.644828] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] hyperv.vswitch_name = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.645016] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.645198] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] mks.enabled = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.645786] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.645994] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] image_cache.manager_interval = 2400 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.646182] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] image_cache.precache_concurrency = 1 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.646404] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] image_cache.remove_unused_base_images = True {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.646612] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.646814] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.646996] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] image_cache.subdirectory_name = _base {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.647211] nova-conductor[51914]: DEBUG oslo_service.service [None 
req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] ironic.api_max_retries = 60 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.647377] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] ironic.api_retry_interval = 2 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.647553] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] ironic.auth_section = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.647729] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] ironic.auth_type = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.647889] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] ironic.cafile = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.648067] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] ironic.certfile = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.648319] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] ironic.collect_timing = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.648501] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] ironic.connect_retries = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.648662] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] ironic.connect_retry_delay = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.648820] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] ironic.endpoint_override = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.648978] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] ironic.insecure = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.649155] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] ironic.keyfile = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.649319] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] ironic.max_version = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.649482] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] ironic.min_version = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.649649] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] ironic.partition_key = None {{(pid=51914) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.649811] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] ironic.peer_list = [] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.649988] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] ironic.region_name = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.650173] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] ironic.serial_console_state_timeout = 10 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.650326] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] ironic.service_name = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.650509] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] ironic.service_type = baremetal {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.650668] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] ironic.split_loggers = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.650823] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] ironic.status_code_retries = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.650980] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] ironic.status_code_retry_delay = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.651147] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] ironic.timeout = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.651322] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.651483] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] ironic.version = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.651698] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.651894] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] key_manager.fixed_key = **** {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.652121] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=51914) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.652301] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] barbican.barbican_api_version = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.652478] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] barbican.barbican_endpoint = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.652665] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] barbican.barbican_endpoint_type = public {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.652840] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] barbican.barbican_region_name = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.653017] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] barbican.cafile = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.653192] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] barbican.certfile = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.653380] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] barbican.collect_timing = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.653583] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] barbican.insecure = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.653740] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] barbican.keyfile = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.653900] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] barbican.number_of_retries = 60 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.654073] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] barbican.retry_delay = 1 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.654259] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] barbican.send_service_user_token = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.654416] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] barbican.split_loggers = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.654568] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] barbican.timeout = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.654721] 
nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] barbican.verify_ssl = True {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.654880] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] barbican.verify_ssl_path = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.655104] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] barbican_service_user.auth_section = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.655271] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] barbican_service_user.auth_type = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.655427] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] barbican_service_user.cafile = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.655577] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] barbican_service_user.certfile = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.655733] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] barbican_service_user.collect_timing = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.655885] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] barbican_service_user.insecure = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.656046] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] barbican_service_user.keyfile = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.656205] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] barbican_service_user.split_loggers = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.656359] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] barbican_service_user.timeout = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.656553] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vault.approle_role_id = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.656771] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vault.approle_secret_id = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.656957] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vault.cafile = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.657137] 
nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vault.certfile = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.657306] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vault.collect_timing = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.657463] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vault.insecure = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.657618] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vault.keyfile = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.657802] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vault.kv_mountpoint = secret {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.657958] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vault.kv_version = 2 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.658133] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vault.namespace = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.658287] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vault.root_token_id = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.658450] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vault.split_loggers = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.658651] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vault.ssl_ca_crt_file = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.658810] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vault.timeout = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.658967] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vault.use_ssl = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.659163] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.659359] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] keystone.cafile = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.659519] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] keystone.certfile = None {{(pid=51914) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.659677] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] keystone.collect_timing = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.659829] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] keystone.connect_retries = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.659984] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] keystone.connect_retry_delay = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.660153] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] keystone.endpoint_override = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.660363] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] keystone.insecure = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.660525] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] keystone.keyfile = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.660675] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] keystone.max_version = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.660822] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] keystone.min_version = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.660971] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] keystone.region_name = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.661134] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] keystone.service_name = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.661296] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] keystone.service_type = identity {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.661451] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] keystone.split_loggers = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.661601] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] keystone.status_code_retries = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.661751] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] keystone.status_code_retry_delay = None {{(pid=51914) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.661941] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] keystone.timeout = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.662148] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.662308] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] keystone.version = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.662547] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.connection_uri = {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.662728] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.cpu_mode = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.662891] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.cpu_model_extra_flags = [] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.663067] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.cpu_models = [] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.663236] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.cpu_power_governor_high = performance {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.663401] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.cpu_power_governor_low = powersave {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.663596] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.cpu_power_management = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.663796] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.663976] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.device_detach_attempts = 8 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.664169] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.device_detach_timeout = 20 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.664333] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.disk_cachemodes = [] {{(pid=51914) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.664488] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.disk_prefix = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.664648] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.enabled_perf_events = [] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.664803] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.file_backed_memory = 0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.664962] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.gid_maps = [] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.665178] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.hw_disk_discard = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.665379] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.hw_machine_type = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.665555] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.images_rbd_ceph_conf = {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.665716] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.665879] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.666057] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.images_rbd_glance_store_name = {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.666228] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.images_rbd_pool = rbd {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.666408] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.images_type = default {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.666564] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.images_volume_group = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.666723] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.inject_key = False {{(pid=51914) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.666911] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.inject_partition = -2 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.667096] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.inject_password = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.667278] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.iscsi_iface = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.667440] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.iser_use_multipath = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.667599] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.live_migration_bandwidth = 0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.667757] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.live_migration_completion_timeout = 800 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.667911] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.live_migration_downtime = 500 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.668079] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.live_migration_downtime_delay = 75 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.668239] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.live_migration_downtime_steps = 10 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.668406] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.live_migration_inbound_addr = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.668590] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.live_migration_permit_auto_converge = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.668757] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.live_migration_permit_post_copy = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.668915] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.live_migration_scheme = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.669095] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] 
libvirt.live_migration_timeout_action = abort {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.669258] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.live_migration_tunnelled = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.669415] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.live_migration_uri = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.669575] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.live_migration_with_native_tls = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.669752] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.max_queues = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.669914] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.mem_stats_period_seconds = 10 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.670136] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.nfs_mount_options = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.670500] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.nfs_mount_point_base = /opt/stack/data/nova/mnt {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.670685] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.num_aoe_discover_tries = 3 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.670850] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.num_iser_scan_tries = 5 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.671015] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.num_memory_encrypted_guests = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.671182] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.num_nvme_discover_tries = 5 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.671340] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.num_pcie_ports = 0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.671501] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.num_volume_scan_tries = 5 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.671715] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc 
None None] libvirt.pmem_namespaces = [] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.671875] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.quobyte_client_cfg = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.672110] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.quobyte_mount_point_base = /opt/stack/data/nova/mnt {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.672274] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.rbd_connect_timeout = 5 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.672435] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.672591] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.672743] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.rbd_secret_uuid = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.672894] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.rbd_user = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.673063] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.realtime_scheduler_priority = 1 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.673276] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.remote_filesystem_transport = ssh {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.673446] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.rescue_image_id = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.673601] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.rescue_kernel_id = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.673755] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.rescue_ramdisk_id = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.673918] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.rng_dev_path = /dev/urandom {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.674086] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None 
None] libvirt.rx_queue_size = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.674259] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.smbfs_mount_options = {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.674486] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.smbfs_mount_point_base = /opt/stack/data/nova/mnt {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.674647] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.snapshot_compression = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.674819] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.snapshot_image_format = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.675063] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.675232] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.sparse_logical_volumes = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.675391] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.swtpm_enabled = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.675562] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.swtpm_group = tss {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.675723] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.swtpm_user = tss {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.675890] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.sysinfo_serial = unique {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.676051] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.tx_queue_size = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.676218] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.uid_maps = [] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.676395] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.use_virtio_for_bridges = True {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.676580] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] 
libvirt.virt_type = kvm {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.676747] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.volume_clear = zero {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.676908] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.volume_clear_size = 0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.677079] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.volume_use_multipath = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.677236] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.vzstorage_cache_path = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.677399] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.677562] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.vzstorage_mount_group = qemu {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.677719] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.vzstorage_mount_opts = [] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.677879] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.678128] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/nova/mnt {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.678322] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.vzstorage_mount_user = stack {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.678493] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.678685] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] neutron.auth_section = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.678856] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] neutron.auth_type = password {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.679021] nova-conductor[51914]: DEBUG oslo_service.service [None 
req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] neutron.cafile = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.679181] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] neutron.certfile = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.679362] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] neutron.collect_timing = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.679522] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] neutron.connect_retries = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.679682] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] neutron.connect_retry_delay = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.679895] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] neutron.default_floating_pool = public {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.680070] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] neutron.endpoint_override = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.680233] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] neutron.extension_sync_interval = 600 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.680390] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] neutron.http_retries = 3 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.680550] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] neutron.insecure = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.680703] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] neutron.keyfile = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.680861] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] neutron.max_version = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.681034] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] neutron.metadata_proxy_shared_secret = **** {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.681190] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] neutron.min_version = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.681361] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] neutron.ovs_bridge = br-int 
{{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.681559] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] neutron.physnets = [] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.681737] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] neutron.region_name = RegionOne {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.681904] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] neutron.service_metadata_proxy = True {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.682073] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] neutron.service_name = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.682246] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] neutron.service_type = network {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.682407] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] neutron.split_loggers = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.682558] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] neutron.status_code_retries = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.682715] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] neutron.status_code_retry_delay = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.682871] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] neutron.timeout = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.683087] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.683299] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] neutron.version = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.683485] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] notifications.bdms_in_notifications = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.683663] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] notifications.default_level = INFO {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.683834] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] notifications.notification_format = unversioned {{(pid=51914) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.683994] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] notifications.notify_on_state_change = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.684183] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.684361] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] pci.alias = [] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.684530] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] pci.device_spec = [] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.684693] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] pci.report_in_placement = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.684916] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] placement.auth_section = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.685105] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] placement.auth_type = password {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.685277] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] placement.auth_url = http://10.180.1.21/identity {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.685434] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] placement.cafile = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.685585] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] placement.certfile = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.685761] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] placement.collect_timing = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.685914] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] placement.connect_retries = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.686076] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] placement.connect_retry_delay = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.686249] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] placement.default_domain_id = None {{(pid=51914) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.686400] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] placement.default_domain_name = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.686588] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] placement.domain_id = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.686748] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] placement.domain_name = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.686900] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] placement.endpoint_override = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.687069] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] placement.insecure = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.687225] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] placement.keyfile = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.687376] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] placement.max_version = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.687528] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] placement.min_version = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.687688] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] placement.password = **** {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.687838] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] placement.project_domain_id = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.687998] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] placement.project_domain_name = Default {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.688206] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] placement.project_id = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.688388] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] placement.project_name = service {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.688558] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] placement.region_name = RegionOne {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.688712] 
nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] placement.service_name = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.688878] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] placement.service_type = placement {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.689047] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] placement.split_loggers = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.689204] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] placement.status_code_retries = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.689355] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] placement.status_code_retry_delay = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.689508] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] placement.system_scope = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.689657] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] placement.timeout = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.689817] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] placement.trust_id = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.690008] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] placement.user_domain_id = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.690180] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] placement.user_domain_name = Default {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.690333] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] placement.user_id = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.690498] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] placement.username = placement {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.690676] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.690829] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] placement.version = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.690996] nova-conductor[51914]: DEBUG oslo_service.service [None 
req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] quota.cores = 20 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.691172] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] quota.count_usage_from_placement = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.691337] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.691569] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] quota.injected_file_content_bytes = 10240 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.691771] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] quota.injected_file_path_length = 255 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.691940] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] quota.injected_files = 5 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.692120] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] quota.instances = 10 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.692283] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] quota.key_pairs = 100 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.692446] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] quota.metadata_items = 128 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.692608] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] quota.ram = 51200 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.692764] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] quota.recheck_quota = True {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.692923] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] quota.server_group_members = 10 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.693090] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] quota.server_groups = 10 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.693283] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] rdp.enabled = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.693598] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=51914) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.693807] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.693992] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.694170] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] scheduler.image_metadata_prefilter = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.694350] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.694525] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] scheduler.max_attempts = 3 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.694683] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] scheduler.max_placement_results = 1000 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.694858] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.695052] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] scheduler.query_placement_for_availability_zone = True {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.695244] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] scheduler.query_placement_for_image_type_support = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.695416] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.695612] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] scheduler.workers = 2 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.695807] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.695979] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] filter_scheduler.aggregate_image_properties_isolation_separator = . 
{{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.696189] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.696430] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.696636] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.696810] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.696969] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.697191] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.697360] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] filter_scheduler.host_subset_size = 1 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.697515] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] filter_scheduler.image_properties_default_architecture = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.697673] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.697833] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] filter_scheduler.isolated_hosts = [] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.698015] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] filter_scheduler.isolated_images = [] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.698191] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] filter_scheduler.max_instances_per_host = 50 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.698383] nova-conductor[51914]: DEBUG oslo_service.service [None 
req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.698550] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] filter_scheduler.pci_in_placement = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.698789] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.698953] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.699127] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.699285] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.699445] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.699608] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.699787] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] filter_scheduler.track_instance_changes = True {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.700050] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.700247] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] metrics.required = True {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.700415] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] metrics.weight_multiplier = 1.0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.700575] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] metrics.weight_of_unavailable = -10000.0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.700758] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] metrics.weight_setting = [] {{(pid=51914) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.701068] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.701250] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] serial_console.enabled = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.701443] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] serial_console.port_range = 10000:20000 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.701637] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.701814] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.701981] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] serial_console.serialproxy_port = 6083 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.702156] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] service_user.auth_section = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.702324] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] service_user.auth_type = password {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.702480] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] service_user.cafile = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.702631] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] service_user.certfile = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.702786] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] service_user.collect_timing = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.702939] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] service_user.insecure = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.703106] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] service_user.keyfile = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.703309] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] service_user.send_service_user_token = True {{(pid=51914) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.703477] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] service_user.split_loggers = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.703629] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] service_user.timeout = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.703792] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] spice.agent_enabled = True {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.703970] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] spice.enabled = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.704326] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.704576] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] spice.html5proxy_host = 0.0.0.0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.704752] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] spice.html5proxy_port = 6082 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.704942] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] spice.image_compression = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.705127] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] spice.jpeg_compression = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.705287] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] spice.playback_compression = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.705474] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] spice.server_listen = 127.0.0.1 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.705660] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.705814] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] spice.streaming_mode = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.705968] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] spice.zlib_compression = None {{(pid=51914) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.706149] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] upgrade_levels.baseapi = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.706309] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] upgrade_levels.cert = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.706502] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] upgrade_levels.compute = auto {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.706687] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] upgrade_levels.conductor = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.706844] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] upgrade_levels.scheduler = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.707010] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vendordata_dynamic_auth.auth_section = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.707176] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vendordata_dynamic_auth.auth_type = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.707334] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vendordata_dynamic_auth.cafile = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.707488] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vendordata_dynamic_auth.certfile = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.707643] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vendordata_dynamic_auth.collect_timing = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.707796] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vendordata_dynamic_auth.insecure = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.707946] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vendordata_dynamic_auth.keyfile = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.708127] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vendordata_dynamic_auth.split_loggers = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.708348] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vendordata_dynamic_auth.timeout = None {{(pid=51914) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.708575] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vmware.api_retry_count = 10 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.708736] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vmware.ca_file = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.708888] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vmware.cache_prefix = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.709052] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vmware.cluster_name = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.709212] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vmware.connection_pool_size = 10 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.709362] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vmware.console_delay_seconds = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.709518] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vmware.datastore_regex = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.709669] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vmware.host_ip = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.709857] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vmware.host_password = **** {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.710044] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vmware.host_port = 443 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.710208] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vmware.host_username = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.710364] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vmware.insecure = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.710518] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vmware.integration_bridge = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.710674] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vmware.maximum_objects = 100 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.710824] nova-conductor[51914]: DEBUG 
oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vmware.pbm_default_policy = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.710978] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vmware.pbm_enabled = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.711143] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vmware.pbm_wsdl_location = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.711306] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vmware.serial_log_dir = /opt/vmware/vspc {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.711469] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vmware.serial_port_proxy_uri = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.711651] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vmware.serial_port_service_uri = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.711817] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vmware.task_poll_interval = 0.5 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.711978] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vmware.use_linked_clone = True {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.712156] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vmware.vnc_keymap = en-us {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.712315] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vmware.vnc_port = 5900 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.712472] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vmware.vnc_port_total = 10000 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.712927] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vnc.auth_schemes = ['none'] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.713163] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vnc.enabled = True {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.713497] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.713687] nova-conductor[51914]: DEBUG oslo_service.service [None 
req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.713855] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vnc.novncproxy_port = 6080 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.714041] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vnc.server_listen = 127.0.0.1 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.714217] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.714378] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vnc.vencrypt_ca_certs = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.714533] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vnc.vencrypt_client_cert = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.714682] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] vnc.vencrypt_client_key = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.714916] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.715107] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] workarounds.disable_deep_image_inspection = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.715270] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] workarounds.disable_fallback_pcpu_query = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.715429] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] workarounds.disable_group_policy_check_upcall = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.715586] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.715739] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] workarounds.disable_rootwrap = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.715892] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] workarounds.enable_numa_live_migration = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.716170] 
nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.716220] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.716385] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] workarounds.handle_virt_lifecycle_events = True {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.716585] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] workarounds.libvirt_disable_apic = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.716792] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] workarounds.never_download_image_if_on_rbd = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.716959] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.717131] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.717297] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.717455] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.717608] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.717764] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.717916] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.718080] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.718241] nova-conductor[51914]: DEBUG oslo_service.service [None 
req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.718472] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.718646] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] wsgi.client_socket_timeout = 900 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.718804] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] wsgi.default_pool_size = 1000 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.718965] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] wsgi.keep_alive = True {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.719140] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] wsgi.max_header_line = 16384 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.719297] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] wsgi.secure_proxy_ssl_header = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.719456] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] wsgi.ssl_ca_file = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.719609] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] wsgi.ssl_cert_file = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.719760] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] wsgi.ssl_key_file = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.719919] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] wsgi.tcp_keepidle = 600 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.720138] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.720308] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] zvm.ca_file = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.720466] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] zvm.cloud_connector_url = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.720689] nova-conductor[51914]: DEBUG 
oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] zvm.image_tmp_path = /opt/stack/data/nova/images {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.720852] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] zvm.reachable_timeout = 300 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.721082] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_policy.enforce_new_defaults = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.721279] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_policy.enforce_scope = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.721469] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_policy.policy_default_rule = default {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.721697] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.721902] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_policy.policy_file = policy.yaml {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.722107] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.722284] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.722442] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.722593] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.722766] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.722953] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.723145] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=51914) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.723364] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] profiler.connection_string = messaging:// {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.723539] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] profiler.enabled = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.723725] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] profiler.es_doc_type = notification {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.723899] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] profiler.es_scroll_size = 10000 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.724073] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] profiler.es_scroll_time = 2m {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.724241] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] profiler.filter_error_trace = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.724404] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] profiler.hmac_keys = SECRET_KEY {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.724565] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] profiler.sentinel_service_name = mymaster {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.724751] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] profiler.socket_timeout = 0.1 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.724912] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] profiler.trace_sqlalchemy = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.725140] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] remote_debug.host = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.725319] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] remote_debug.port = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.725504] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.725663] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=51914) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.725819] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.725974] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.726145] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.726306] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.726463] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.726648] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.726824] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.726980] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.727165] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.727329] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.727501] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.727663] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.727820] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 
419.727990] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.728158] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.728322] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.728483] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.728639] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.728789] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.728999] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.729180] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.729342] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.729502] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.729666] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_messaging_rabbit.ssl = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.729855] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.730049] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.730215] nova-conductor[51914]: DEBUG oslo_service.service [None 
req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.730381] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.730547] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_messaging_rabbit.ssl_version = {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.730757] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.730919] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_messaging_notifications.retry = -1 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.731106] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.731276] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_messaging_notifications.transport_url = **** {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.731476] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_limit.auth_section = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.731641] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_limit.auth_type = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.731818] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_limit.cafile = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.731989] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_limit.certfile = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.732168] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_limit.collect_timing = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.732326] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_limit.connect_retries = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.732503] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_limit.connect_retry_delay = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.732658] nova-conductor[51914]: DEBUG 
oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_limit.endpoint_id = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.732825] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_limit.endpoint_override = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.732984] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_limit.insecure = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.733190] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_limit.keyfile = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.733352] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_limit.max_version = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.733506] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_limit.min_version = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.733676] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_limit.region_name = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.733824] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_limit.service_name = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.733971] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_limit.service_type = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.734140] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_limit.split_loggers = False {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.734292] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_limit.status_code_retries = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.734445] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_limit.status_code_retry_delay = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.734594] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_limit.timeout = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.734753] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_limit.valid_interfaces = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.734905] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc 
None None] oslo_limit.version = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.735125] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_reports.file_event_handler = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.738346] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_reports.file_event_handler_interval = 1 {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.738346] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] oslo_reports.log_dir = None {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 419.738346] nova-conductor[51914]: DEBUG oslo_service.service [None req-3df27cf0-551b-4ef0-a9de-933471838fdc None None] ******************************************************************************** {{(pid=51914) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2613}} [ 506.460620] nova-conductor[52626]: DEBUG oslo_db.sqlalchemy.engines [None req-0e0fcf11-36b8-43cb-852f-c1ca89f96703 None None] Parent process 51914 forked (52626) with an open database connection, which is being discarded and recreated. {{(pid=52626) checkout /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:434}} [ 552.518696] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Took 0.40 seconds to select destinations for 1 instance(s). 
{{(pid=52626) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 552.551905] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 552.552267] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 552.554944] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.002s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 552.559975] nova-conductor[52626]: DEBUG oslo_db.sqlalchemy.engines [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=52626) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 552.639234] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 552.639469] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 552.639982] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 552.640383] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 
tempest-ServerDiagnosticsV248Test-720135860-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 552.640654] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 552.640823] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 552.649622] nova-conductor[52626]: DEBUG oslo_db.sqlalchemy.engines [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=52626) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 552.662674] nova-conductor[52625]: DEBUG oslo_db.sqlalchemy.engines [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Parent process 51914 forked (52625) with an open database connection, which is being discarded and recreated. {{(pid=52625) checkout /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:434}} [ 552.666782] nova-conductor[52626]: DEBUG nova.quota [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Getting quotas for project c11194ae93ad4f41bf4a8e89106dfa44. Resources: {'cores', 'instances', 'ram'} {{(pid=52626) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 552.670201] nova-conductor[52626]: DEBUG nova.quota [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Getting quotas for user 42b0436ba3d44dfc9236c3db7fc1ee9d and project c11194ae93ad4f41bf4a8e89106dfa44. 
Resources: {'cores', 'instances', 'ram'} {{(pid=52626) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 552.675087] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52626) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 552.675748] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 552.676095] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 552.676181] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 552.680831] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 552.681526] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 552.681734] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" 
acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 552.681910] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 552.709529] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 552.709529] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 552.709529] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 552.709529] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Acquiring lock "compute-rpcapi-router" {{(pid=52626) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 552.709723] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Acquired lock "compute-rpcapi-router" {{(pid=52626) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 552.709723] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-d23357ce-2224-42d9-a7a1-36a85c5ba098 None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 552.709723] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-d23357ce-2224-42d9-a7a1-36a85c5ba098 None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 552.709723] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils 
[None req-d23357ce-2224-42d9-a7a1-36a85c5ba098 None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 552.709723] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-d23357ce-2224-42d9-a7a1-36a85c5ba098 None None] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 552.709867] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-d23357ce-2224-42d9-a7a1-36a85c5ba098 None None] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 552.709867] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-d23357ce-2224-42d9-a7a1-36a85c5ba098 None None] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 552.717231] nova-conductor[52626]: INFO nova.compute.rpcapi [None req-d23357ce-2224-42d9-a7a1-36a85c5ba098 None None] Automatically selected compute RPC version 6.2 from minimum service version 66 [ 552.717320] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-d23357ce-2224-42d9-a7a1-36a85c5ba098 None None] Releasing lock "compute-rpcapi-router" {{(pid=52626) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 552.914038] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Took 0.24 seconds to select destinations for 1 instance(s). 
{{(pid=52625) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 552.942913] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 552.945020] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 552.945020] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.002s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 552.957893] nova-conductor[52625]: DEBUG oslo_db.sqlalchemy.engines [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=52625) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 553.032109] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 553.032394] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 553.032902] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 553.033357] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 
tempest-ServerExternalEventsTest-1135534376-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 553.033487] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 553.035603] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 553.042830] nova-conductor[52625]: DEBUG oslo_db.sqlalchemy.engines [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=52625) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 553.060692] nova-conductor[52625]: DEBUG nova.quota [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Getting quotas for project d02a18280b9642539083abb609f328d5. Resources: {'cores', 'instances', 'ram'} {{(pid=52625) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 553.064687] nova-conductor[52625]: DEBUG nova.quota [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Getting quotas for user 0c1c3b48259043c68eb019adafc1a116 and project d02a18280b9642539083abb609f328d5. 
Resources: {'cores', 'instances', 'ram'} {{(pid=52625) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 553.070365] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52625) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 553.071025] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 553.072235] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 553.072235] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 553.076027] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 553.076508] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 553.076738] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" 
acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 553.076917] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 553.137458] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 553.137740] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 553.137950] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 553.138277] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Acquiring lock "compute-rpcapi-router" {{(pid=52625) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 553.138423] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Acquired lock "compute-rpcapi-router" {{(pid=52625) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 553.139128] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-d53ae19a-c201-4cbe-b6a9-96d6e7ebdfe6 None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 553.139210] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-d53ae19a-c201-4cbe-b6a9-96d6e7ebdfe6 None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 553.139374] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils 
[None req-d53ae19a-c201-4cbe-b6a9-96d6e7ebdfe6 None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 553.139711] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-d53ae19a-c201-4cbe-b6a9-96d6e7ebdfe6 None None] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 553.139886] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-d53ae19a-c201-4cbe-b6a9-96d6e7ebdfe6 None None] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 553.140058] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-d53ae19a-c201-4cbe-b6a9-96d6e7ebdfe6 None None] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 553.147060] nova-conductor[52625]: INFO nova.compute.rpcapi [None req-d53ae19a-c201-4cbe-b6a9-96d6e7ebdfe6 None None] Automatically selected compute RPC version 6.2 from minimum service version 66 [ 553.147780] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-d53ae19a-c201-4cbe-b6a9-96d6e7ebdfe6 None None] Releasing lock "compute-rpcapi-router" {{(pid=52625) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 553.264138] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Took 0.19 seconds to select destinations for 1 instance(s). {{(pid=52626) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 553.283978] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Took 0.19 seconds to select destinations for 1 instance(s). 
{{(pid=52625) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 553.315096] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 553.315341] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 553.315516] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 553.317875] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 553.318131] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 553.318328] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 553.367323] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 553.367588] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 
tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 553.367788] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 553.368278] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 553.368345] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 553.368507] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 553.371229] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 553.372039] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 553.372039] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 553.372039] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 553.372216] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 553.372379] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 553.381026] nova-conductor[52626]: DEBUG nova.quota [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Getting quotas for project 72271ead0a4a44da9b2f69a5062734e2. Resources: {'cores', 'instances', 'ram'} {{(pid=52626) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 553.381026] nova-conductor[52626]: DEBUG nova.quota [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Getting quotas for user 63fafad766aa42758af1a36008299adb and project 72271ead0a4a44da9b2f69a5062734e2. 
Resources: {'cores', 'instances', 'ram'} {{(pid=52626) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 553.386774] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52626) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 553.387393] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 553.387712] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 553.387805] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 553.389592] nova-conductor[52625]: DEBUG nova.quota [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Getting quotas for project 502b35c7d9b44881ac0c5052e7783f3d. 
Resources: {'cores', 'instances', 'ram'} {{(pid=52625) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 553.391734] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 553.392458] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 553.392579] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 553.392805] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 553.393784] nova-conductor[52625]: DEBUG nova.quota [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Getting quotas for user 51ecb9d39b5e4829a1d68198cb05e5a8 and project 502b35c7d9b44881ac0c5052e7783f3d. 
Resources: {'cores', 'instances', 'ram'} {{(pid=52625) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 553.404307] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52625) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 553.405167] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 553.405167] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 553.405293] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 553.407353] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 553.408446] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 553.408446] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 553.409031] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 
tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 553.409711] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 553.410323] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 553.410323] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 553.432852] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 553.433100] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 553.433278] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 554.214765] nova-conductor[52625]: 
DEBUG nova.conductor.manager [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Took 0.17 seconds to select destinations for 1 instance(s). {{(pid=52625) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 554.256766] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 554.257057] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 554.257257] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 554.335517] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 554.335718] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 554.335918] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 554.336425] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 554.336616] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 554.336843] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 554.347746] nova-conductor[52625]: DEBUG nova.quota [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Getting quotas for project a2116357188240649088d46f3a6c3c50. Resources: {'cores', 'instances', 'ram'} {{(pid=52625) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 554.350615] nova-conductor[52625]: DEBUG nova.quota [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Getting quotas for user b037cf8807344a588fb6691968546879 and project a2116357188240649088d46f3a6c3c50. Resources: {'cores', 'instances', 'ram'} {{(pid=52625) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 554.361398] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52625) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 554.361398] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 554.361398] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 554.361398] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 554.364509] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 554.365497] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 554.365939] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 554.367156] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 554.391022] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 554.391022] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 554.391022] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 
tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 555.368831] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Took 0.21 seconds to select destinations for 1 instance(s). {{(pid=52625) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 555.389472] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 555.389754] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 555.389901] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 555.448892] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 555.449784] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 555.450163] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 555.451301] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None 
req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 555.451301] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 555.451301] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 555.466076] nova-conductor[52625]: DEBUG nova.quota [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Getting quotas for project ce403fd7ca154238a6c92f219ddf95fc. Resources: {'cores', 'instances', 'ram'} {{(pid=52625) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 555.469392] nova-conductor[52625]: DEBUG nova.quota [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Getting quotas for user ef369e94b11e4ec987e2455f4232d947 and project ce403fd7ca154238a6c92f219ddf95fc. 
Resources: {'cores', 'instances', 'ram'} {{(pid=52625) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 555.481125] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52625) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 555.481125] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 555.481125] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 555.481296] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 555.484704] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 555.487615] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 555.487615] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 555.487615] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 555.509080] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 555.509378] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 555.509561] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 559.005330] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Took 0.15 seconds to select destinations for 1 instance(s). 
{{(pid=52626) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 559.019454] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 559.021310] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.002s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 559.021526] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 559.048934] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 559.049482] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 559.049746] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 559.050216] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 559.050562] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 559.050784] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 559.063458] nova-conductor[52626]: DEBUG nova.quota [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Getting quotas for project ed721a0a42ee43fba6f37868594bffec. Resources: {'cores', 'instances', 'ram'} {{(pid=52626) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 559.066905] nova-conductor[52626]: DEBUG nova.quota [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Getting quotas for user 4930bb1ad0cc4376a388847b3238dded and project ed721a0a42ee43fba6f37868594bffec. Resources: {'cores', 'instances', 'ram'} {{(pid=52626) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 559.073221] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52626) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 559.073829] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 559.074244] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 559.075153] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 559.079683] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 559.082170] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 559.082170] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 559.082170] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 559.094976] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 559.095540] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 559.095856] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 562.381980] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Took 0.16 seconds to select destinations for 1 
instance(s). {{(pid=52626) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 562.400409] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 562.400520] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 562.400726] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 562.449330] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 562.449637] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 562.449855] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 562.450293] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 562.450536] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 562.450738] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 562.465576] nova-conductor[52626]: DEBUG nova.quota [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Getting quotas for project c984e18184364cb6a11bd2014bc377b3. Resources: {'cores', 'instances', 'ram'} {{(pid=52626) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 562.468227] nova-conductor[52626]: DEBUG nova.quota [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Getting quotas for user 3736edde826e4b4cabc61b17c223ace6 and project c984e18184364cb6a11bd2014bc377b3. Resources: {'cores', 'instances', 'ram'} {{(pid=52626) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 562.475127] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52626) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 562.475127] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 562.475353] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 562.475389] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 562.479140] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 562.479709] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 562.479924] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 562.480551] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 562.498571] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 562.498802] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 562.499018] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 563.855729] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Took 0.14 seconds to select destinations for 1 instance(s). 
{{(pid=52625) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 563.875736] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 563.875861] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 563.879020] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 563.921951] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 563.922434] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 563.922549] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 563.922864] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 563.923065] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 563.923236] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 563.934517] nova-conductor[52625]: DEBUG nova.quota [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Getting quotas for project 0f9d7fe46ca145d3983ce03907f5842c. Resources: {'cores', 'instances', 'ram'} {{(pid=52625) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 563.937483] nova-conductor[52625]: DEBUG nova.quota [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Getting quotas for user 36d95e8e77d74f019725115e00d59093 and project 0f9d7fe46ca145d3983ce03907f5842c. Resources: {'cores', 'instances', 'ram'} {{(pid=52625) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 563.957009] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52625) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 563.958980] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 563.959272] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 563.959454] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 563.964878] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 563.965603] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 563.965786] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 563.966661] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 563.983842] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 563.984243] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 563.984243] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 572.051211] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 
tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Took 0.13 seconds to select destinations for 1 instance(s). {{(pid=52625) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 572.065020] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 572.065020] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 572.065020] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 572.097263] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 572.097668] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 572.097996] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 572.100020] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 572.100020] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None 
req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 572.100020] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 572.111107] nova-conductor[52625]: DEBUG nova.quota [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Getting quotas for project b6a82227811a40cda939f7164f414da2. Resources: {'cores', 'instances', 'ram'} {{(pid=52625) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 572.113727] nova-conductor[52625]: DEBUG nova.quota [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Getting quotas for user d0925d391e2248eab9b7334e277d5d64 and project b6a82227811a40cda939f7164f414da2. Resources: {'cores', 'instances', 'ram'} {{(pid=52625) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 572.124311] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52625) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 572.124311] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 572.124311] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 572.124311] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 572.127296] nova-conductor[52625]: DEBUG 
nova.conductor.manager [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 572.128279] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 572.128603] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 572.128921] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 572.142630] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 572.145025] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 572.145025] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s 
{{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 572.163653] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Took 0.12 seconds to select destinations for 1 instance(s). {{(pid=52626) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 572.200709] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 572.200998] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 572.201236] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 572.244870] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 572.245119] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 572.245302] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 572.245652] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Acquiring lock 
"e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 572.245885] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 572.246074] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 572.258681] nova-conductor[52626]: DEBUG nova.quota [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Getting quotas for project 911ada63d0ee4b5f965cb5d251ab5a78. Resources: {'cores', 'instances', 'ram'} {{(pid=52626) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 572.261644] nova-conductor[52626]: DEBUG nova.quota [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Getting quotas for user 8ec9debbb6274965b14fd444ab31e352 and project 911ada63d0ee4b5f965cb5d251ab5a78. 
Resources: {'cores', 'instances', 'ram'} {{(pid=52626) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 572.269741] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52626) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 572.270311] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 572.270520] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 572.270702] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 572.273298] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 572.273940] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 572.274170] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 
tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 572.274343] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 572.294664] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 572.294808] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 572.294963] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 575.888253] nova-conductor[52626]: ERROR nova.conductor.manager [None req-303ca500-a252-4897-ace1-5cf6c7dddf4f tempest-ServersAdmin275Test-788199302 tempest-ServersAdmin275Test-788199302-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 575.888253] nova-conductor[52626]: Traceback (most recent call last): [ 575.888253] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 575.888253] nova-conductor[52626]: return func(*args, **kwargs) [ 575.888253] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 575.888253] nova-conductor[52626]: selections = self._select_destinations( [ 575.888253] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 575.888253] nova-conductor[52626]: selections = self._schedule( [ 575.888253] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 575.888253] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 575.888253] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 575.888253] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 575.888253] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 575.888253] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 575.888253] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 575.888253] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 575.888253] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 575.888253] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 575.888253] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 575.888253] nova-conductor[52626]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 575.888253] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 575.889077] nova-conductor[52626]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 575.889077] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 575.889077] nova-conductor[52626]: ERROR nova.conductor.manager result = self.transport._send( [ 575.889077] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 575.889077] nova-conductor[52626]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 575.889077] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 575.889077] nova-conductor[52626]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 575.889077] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 575.889077] nova-conductor[52626]: ERROR nova.conductor.manager raise result [ 575.889077] nova-conductor[52626]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 575.889077] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 575.889077] nova-conductor[52626]: ERROR nova.conductor.manager [ 575.889077] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 575.889077] nova-conductor[52626]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 575.889077] nova-conductor[52626]: ERROR nova.conductor.manager [ 575.889077] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 575.889077] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._select_destinations( [ 575.889077] nova-conductor[52626]: ERROR nova.conductor.manager [ 575.889077] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 575.889077] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._schedule( [ 575.889077] nova-conductor[52626]: ERROR nova.conductor.manager [ 575.889923] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 575.889923] nova-conductor[52626]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 575.889923] nova-conductor[52626]: ERROR nova.conductor.manager [ 575.889923] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 575.889923] nova-conductor[52626]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 575.889923] nova-conductor[52626]: ERROR nova.conductor.manager [ 575.889923] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 575.889923] nova-conductor[52626]: ERROR nova.conductor.manager [ 575.889923] nova-conductor[52626]: ERROR nova.conductor.manager [ 575.903881] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-303ca500-a252-4897-ace1-5cf6c7dddf4f tempest-ServersAdmin275Test-788199302 tempest-ServersAdmin275Test-788199302-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 575.904208] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-303ca500-a252-4897-ace1-5cf6c7dddf4f tempest-ServersAdmin275Test-788199302 tempest-ServersAdmin275Test-788199302-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 575.904400] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-303ca500-a252-4897-ace1-5cf6c7dddf4f tempest-ServersAdmin275Test-788199302 tempest-ServersAdmin275Test-788199302-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 575.980324] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-303ca500-a252-4897-ace1-5cf6c7dddf4f tempest-ServersAdmin275Test-788199302 tempest-ServersAdmin275Test-788199302-project-member] [instance: a0d4465d-1706-4dfd-8207-8a0547a575d8] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 575.981061] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-303ca500-a252-4897-ace1-5cf6c7dddf4f tempest-ServersAdmin275Test-788199302 tempest-ServersAdmin275Test-788199302-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 575.981267] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-303ca500-a252-4897-ace1-5cf6c7dddf4f tempest-ServersAdmin275Test-788199302 tempest-ServersAdmin275Test-788199302-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 575.981430] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-303ca500-a252-4897-ace1-5cf6c7dddf4f tempest-ServersAdmin275Test-788199302 tempest-ServersAdmin275Test-788199302-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 575.986496] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-303ca500-a252-4897-ace1-5cf6c7dddf4f tempest-ServersAdmin275Test-788199302 tempest-ServersAdmin275Test-788199302-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 575.986496] nova-conductor[52626]: Traceback (most recent call last): [ 575.986496] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 575.986496] nova-conductor[52626]: return func(*args, **kwargs) [ 575.986496] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 575.986496] nova-conductor[52626]: selections = self._select_destinations( [ 575.986496] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 575.986496] nova-conductor[52626]: selections = self._schedule( [ 575.986496] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 575.986496] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 575.986496] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 575.986496] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 575.986496] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 575.986496] nova-conductor[52626]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 575.987243] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-303ca500-a252-4897-ace1-5cf6c7dddf4f tempest-ServersAdmin275Test-788199302 tempest-ServersAdmin275Test-788199302-project-member] [instance: a0d4465d-1706-4dfd-8207-8a0547a575d8] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 577.174765] nova-conductor[52625]: ERROR nova.conductor.manager [None req-da482807-d530-42bf-a8d1-b05480b89c84 tempest-ServersTestManualDisk-2067927717 tempest-ServersTestManualDisk-2067927717-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 577.174765] nova-conductor[52625]: Traceback (most recent call last): [ 577.174765] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 577.174765] nova-conductor[52625]: return func(*args, **kwargs) [ 577.174765] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 577.174765] nova-conductor[52625]: selections = self._select_destinations( [ 577.174765] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 577.174765] nova-conductor[52625]: selections = self._schedule( [ 577.174765] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 577.174765] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 577.174765] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 577.174765] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 577.174765] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 577.174765] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 577.174765] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 577.174765] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 577.174765] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 577.174765] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 577.174765] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 577.174765] nova-conductor[52625]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 577.174765] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 577.177077] nova-conductor[52625]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 577.177077] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 577.177077] nova-conductor[52625]: ERROR nova.conductor.manager result = self.transport._send( [ 577.177077] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 577.177077] nova-conductor[52625]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 577.177077] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 577.177077] nova-conductor[52625]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 577.177077] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 577.177077] nova-conductor[52625]: ERROR nova.conductor.manager raise result [ 577.177077] nova-conductor[52625]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 577.177077] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 577.177077] nova-conductor[52625]: ERROR nova.conductor.manager [ 577.177077] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 577.177077] nova-conductor[52625]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 577.177077] nova-conductor[52625]: ERROR nova.conductor.manager [ 577.177077] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 577.177077] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._select_destinations( [ 577.177077] nova-conductor[52625]: ERROR nova.conductor.manager [ 577.177077] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 577.177077] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._schedule( [ 577.177077] nova-conductor[52625]: ERROR nova.conductor.manager [ 577.178832] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 577.178832] nova-conductor[52625]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 577.178832] nova-conductor[52625]: ERROR nova.conductor.manager [ 577.178832] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 577.178832] nova-conductor[52625]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 577.178832] nova-conductor[52625]: ERROR nova.conductor.manager [ 577.178832] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 577.178832] nova-conductor[52625]: ERROR nova.conductor.manager [ 577.178832] nova-conductor[52625]: ERROR nova.conductor.manager [ 577.185764] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-da482807-d530-42bf-a8d1-b05480b89c84 tempest-ServersTestManualDisk-2067927717 tempest-ServersTestManualDisk-2067927717-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 577.185764] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-da482807-d530-42bf-a8d1-b05480b89c84 tempest-ServersTestManualDisk-2067927717 tempest-ServersTestManualDisk-2067927717-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 577.185764] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-da482807-d530-42bf-a8d1-b05480b89c84 tempest-ServersTestManualDisk-2067927717 tempest-ServersTestManualDisk-2067927717-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 577.280371] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-da482807-d530-42bf-a8d1-b05480b89c84 tempest-ServersTestManualDisk-2067927717 tempest-ServersTestManualDisk-2067927717-project-member] [instance: 382a0fd9-2011-4876-9a0d-98339803d5c5] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 577.280899] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-da482807-d530-42bf-a8d1-b05480b89c84 tempest-ServersTestManualDisk-2067927717 tempest-ServersTestManualDisk-2067927717-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 577.281162] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-da482807-d530-42bf-a8d1-b05480b89c84 tempest-ServersTestManualDisk-2067927717 tempest-ServersTestManualDisk-2067927717-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 577.281329] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-da482807-d530-42bf-a8d1-b05480b89c84 tempest-ServersTestManualDisk-2067927717 tempest-ServersTestManualDisk-2067927717-project-member] Lock "00000000-0000-0000-0000-000000000000" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 577.294241] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-da482807-d530-42bf-a8d1-b05480b89c84 tempest-ServersTestManualDisk-2067927717 tempest-ServersTestManualDisk-2067927717-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 577.294241] nova-conductor[52625]: Traceback (most recent call last): [ 577.294241] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 577.294241] nova-conductor[52625]: return func(*args, **kwargs) [ 577.294241] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 577.294241] nova-conductor[52625]: selections = self._select_destinations( [ 577.294241] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 577.294241] nova-conductor[52625]: selections = self._schedule( [ 577.294241] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 577.294241] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 577.294241] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 577.294241] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 577.294241] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 577.294241] nova-conductor[52625]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 577.295509] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-da482807-d530-42bf-a8d1-b05480b89c84 tempest-ServersTestManualDisk-2067927717 tempest-ServersTestManualDisk-2067927717-project-member] [instance: 382a0fd9-2011-4876-9a0d-98339803d5c5] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 578.937705] nova-conductor[52626]: ERROR nova.conductor.manager [None req-489539b7-699b-444f-a93f-f74058b8674c tempest-ServersAaction247Test-1220264343 tempest-ServersAaction247Test-1220264343-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 578.937705] nova-conductor[52626]: Traceback (most recent call last): [ 578.937705] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 578.937705] nova-conductor[52626]: return func(*args, **kwargs) [ 578.937705] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 578.937705] nova-conductor[52626]: selections = self._select_destinations( [ 578.937705] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 578.937705] nova-conductor[52626]: selections = self._schedule( [ 578.937705] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 578.937705] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 578.937705] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 578.937705] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 578.937705] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 578.937705] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 578.937705] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 578.937705] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 578.937705] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 578.937705] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 578.937705] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 578.937705] nova-conductor[52626]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 578.937705] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 578.938599] nova-conductor[52626]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 578.938599] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 578.938599] nova-conductor[52626]: ERROR nova.conductor.manager result = self.transport._send( [ 578.938599] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 578.938599] nova-conductor[52626]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 578.938599] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 578.938599] nova-conductor[52626]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 578.938599] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 578.938599] nova-conductor[52626]: ERROR nova.conductor.manager raise result [ 578.938599] nova-conductor[52626]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 578.938599] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 578.938599] nova-conductor[52626]: ERROR nova.conductor.manager [ 578.938599] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 578.938599] nova-conductor[52626]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 578.938599] nova-conductor[52626]: ERROR nova.conductor.manager [ 578.938599] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 578.938599] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._select_destinations( [ 578.938599] nova-conductor[52626]: ERROR nova.conductor.manager [ 578.938599] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 578.938599] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._schedule( [ 578.938599] nova-conductor[52626]: ERROR nova.conductor.manager [ 578.939147] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 578.939147] nova-conductor[52626]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 578.939147] nova-conductor[52626]: ERROR nova.conductor.manager [ 578.939147] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 578.939147] nova-conductor[52626]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 578.939147] nova-conductor[52626]: ERROR nova.conductor.manager [ 578.939147] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 578.939147] nova-conductor[52626]: ERROR nova.conductor.manager [ 578.939147] nova-conductor[52626]: ERROR nova.conductor.manager [ 578.947159] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-489539b7-699b-444f-a93f-f74058b8674c tempest-ServersAaction247Test-1220264343 tempest-ServersAaction247Test-1220264343-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 578.947569] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-489539b7-699b-444f-a93f-f74058b8674c tempest-ServersAaction247Test-1220264343 tempest-ServersAaction247Test-1220264343-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 578.947867] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-489539b7-699b-444f-a93f-f74058b8674c tempest-ServersAaction247Test-1220264343 tempest-ServersAaction247Test-1220264343-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 578.994330] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-489539b7-699b-444f-a93f-f74058b8674c tempest-ServersAaction247Test-1220264343 tempest-ServersAaction247Test-1220264343-project-member] [instance: 2c071322-8cce-42d3-bfad-db97b5c5664a] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 578.995305] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-489539b7-699b-444f-a93f-f74058b8674c tempest-ServersAaction247Test-1220264343 tempest-ServersAaction247Test-1220264343-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 578.995648] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-489539b7-699b-444f-a93f-f74058b8674c tempest-ServersAaction247Test-1220264343 tempest-ServersAaction247Test-1220264343-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 578.996756] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-489539b7-699b-444f-a93f-f74058b8674c tempest-ServersAaction247Test-1220264343 tempest-ServersAaction247Test-1220264343-project-member] Lock "00000000-0000-0000-0000-000000000000" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 578.999065] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-489539b7-699b-444f-a93f-f74058b8674c tempest-ServersAaction247Test-1220264343 tempest-ServersAaction247Test-1220264343-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 578.999065] nova-conductor[52626]: Traceback (most recent call last): [ 578.999065] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 578.999065] nova-conductor[52626]: return func(*args, **kwargs) [ 578.999065] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 578.999065] nova-conductor[52626]: selections = self._select_destinations( [ 578.999065] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 578.999065] nova-conductor[52626]: selections = self._schedule( [ 578.999065] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 578.999065] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 578.999065] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 578.999065] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 578.999065] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 578.999065] nova-conductor[52626]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 579.001434] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-489539b7-699b-444f-a93f-f74058b8674c tempest-ServersAaction247Test-1220264343 tempest-ServersAaction247Test-1220264343-project-member] [instance: 2c071322-8cce-42d3-bfad-db97b5c5664a] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 580.407076] nova-conductor[52625]: ERROR nova.conductor.manager [None req-f5c7aa06-60ad-4b7b-ac35-d3a036be1902 tempest-FloatingIPsAssociationNegativeTestJSON-1030961665 tempest-FloatingIPsAssociationNegativeTestJSON-1030961665-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 580.407076] nova-conductor[52625]: Traceback (most recent call last): [ 580.407076] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 580.407076] nova-conductor[52625]: return func(*args, **kwargs) [ 580.407076] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 580.407076] nova-conductor[52625]: selections = self._select_destinations( [ 580.407076] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 580.407076] nova-conductor[52625]: selections = self._schedule( [ 580.407076] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 580.407076] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 580.407076] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 580.407076] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 580.407076] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 580.407076] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 580.407076] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 580.407076] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 580.407076] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 580.407076] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 580.407076] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 580.407076] nova-conductor[52625]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 580.407797] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 580.407797] nova-conductor[52625]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 580.407797] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 580.407797] nova-conductor[52625]: ERROR nova.conductor.manager result = self.transport._send( [ 580.407797] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 580.407797] nova-conductor[52625]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 580.407797] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 580.407797] nova-conductor[52625]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 580.407797] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 580.407797] nova-conductor[52625]: ERROR nova.conductor.manager raise result [ 580.407797] nova-conductor[52625]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 580.407797] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 580.407797] nova-conductor[52625]: ERROR nova.conductor.manager [ 580.407797] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 580.407797] nova-conductor[52625]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 580.407797] nova-conductor[52625]: ERROR nova.conductor.manager [ 580.407797] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 580.407797] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._select_destinations( [ 580.407797] nova-conductor[52625]: ERROR nova.conductor.manager [ 580.408289] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 580.408289] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._schedule( [ 580.408289] nova-conductor[52625]: ERROR nova.conductor.manager [ 580.408289] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 580.408289] nova-conductor[52625]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 580.408289] nova-conductor[52625]: ERROR nova.conductor.manager [ 580.408289] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 580.408289] nova-conductor[52625]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 580.408289] nova-conductor[52625]: ERROR nova.conductor.manager [ 580.408289] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 580.408289] nova-conductor[52625]: ERROR nova.conductor.manager [ 580.408289] nova-conductor[52625]: ERROR nova.conductor.manager [ 580.421540] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-f5c7aa06-60ad-4b7b-ac35-d3a036be1902 tempest-FloatingIPsAssociationNegativeTestJSON-1030961665 tempest-FloatingIPsAssociationNegativeTestJSON-1030961665-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 580.421802] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-f5c7aa06-60ad-4b7b-ac35-d3a036be1902 tempest-FloatingIPsAssociationNegativeTestJSON-1030961665 tempest-FloatingIPsAssociationNegativeTestJSON-1030961665-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 580.421965] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-f5c7aa06-60ad-4b7b-ac35-d3a036be1902 tempest-FloatingIPsAssociationNegativeTestJSON-1030961665 tempest-FloatingIPsAssociationNegativeTestJSON-1030961665-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 580.485493] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-f5c7aa06-60ad-4b7b-ac35-d3a036be1902 tempest-FloatingIPsAssociationNegativeTestJSON-1030961665 tempest-FloatingIPsAssociationNegativeTestJSON-1030961665-project-member] [instance: 61a91bce-f6c6-45d4-8ce0-33a9a7873fa6] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 580.486437] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-f5c7aa06-60ad-4b7b-ac35-d3a036be1902 tempest-FloatingIPsAssociationNegativeTestJSON-1030961665 tempest-FloatingIPsAssociationNegativeTestJSON-1030961665-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 580.486680] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-f5c7aa06-60ad-4b7b-ac35-d3a036be1902 tempest-FloatingIPsAssociationNegativeTestJSON-1030961665 tempest-FloatingIPsAssociationNegativeTestJSON-1030961665-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 580.486856] nova-conductor[52625]: DEBUG 
oslo_concurrency.lockutils [None req-f5c7aa06-60ad-4b7b-ac35-d3a036be1902 tempest-FloatingIPsAssociationNegativeTestJSON-1030961665 tempest-FloatingIPsAssociationNegativeTestJSON-1030961665-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 580.490618] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-f5c7aa06-60ad-4b7b-ac35-d3a036be1902 tempest-FloatingIPsAssociationNegativeTestJSON-1030961665 tempest-FloatingIPsAssociationNegativeTestJSON-1030961665-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 580.490618] nova-conductor[52625]: Traceback (most recent call last): [ 580.490618] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 580.490618] nova-conductor[52625]: return func(*args, **kwargs) [ 580.490618] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 580.490618] nova-conductor[52625]: selections = self._select_destinations( [ 580.490618] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 580.490618] nova-conductor[52625]: selections = self._schedule( [ 580.490618] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 580.490618] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 580.490618] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 580.490618] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 580.490618] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 580.490618] nova-conductor[52625]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 580.493016] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-f5c7aa06-60ad-4b7b-ac35-d3a036be1902 tempest-FloatingIPsAssociationNegativeTestJSON-1030961665 tempest-FloatingIPsAssociationNegativeTestJSON-1030961665-project-member] [instance: 61a91bce-f6c6-45d4-8ce0-33a9a7873fa6] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 581.349714] nova-conductor[52626]: ERROR nova.conductor.manager [None req-7451d85c-0306-4dfa-92a7-224a16738578 tempest-TenantUsagesTestJSON-822390215 tempest-TenantUsagesTestJSON-822390215-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 581.349714] nova-conductor[52626]: Traceback (most recent call last): [ 581.349714] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 581.349714] nova-conductor[52626]: return func(*args, **kwargs) [ 581.349714] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 581.349714] nova-conductor[52626]: selections = self._select_destinations( [ 581.349714] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 581.349714] nova-conductor[52626]: selections = self._schedule( [ 581.349714] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 581.349714] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 581.349714] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 581.349714] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 581.349714] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 581.349714] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 581.349714] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 581.349714] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 581.349714] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 581.349714] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 581.349714] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 581.349714] nova-conductor[52626]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 581.349714] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 581.350575] nova-conductor[52626]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 581.350575] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 581.350575] nova-conductor[52626]: ERROR nova.conductor.manager result = self.transport._send( [ 581.350575] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 581.350575] nova-conductor[52626]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 581.350575] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 581.350575] nova-conductor[52626]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 581.350575] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 581.350575] nova-conductor[52626]: ERROR nova.conductor.manager raise result [ 581.350575] nova-conductor[52626]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 581.350575] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 581.350575] nova-conductor[52626]: ERROR nova.conductor.manager [ 581.350575] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 581.350575] nova-conductor[52626]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 581.350575] nova-conductor[52626]: ERROR nova.conductor.manager [ 581.350575] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 581.350575] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._select_destinations( [ 581.350575] nova-conductor[52626]: ERROR nova.conductor.manager [ 581.350575] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 581.350575] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._schedule( [ 581.350575] nova-conductor[52626]: ERROR nova.conductor.manager [ 581.351171] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 581.351171] nova-conductor[52626]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 581.351171] nova-conductor[52626]: ERROR nova.conductor.manager [ 581.351171] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 581.351171] nova-conductor[52626]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 581.351171] nova-conductor[52626]: ERROR nova.conductor.manager [ 581.351171] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 581.351171] nova-conductor[52626]: ERROR nova.conductor.manager [ 581.351171] nova-conductor[52626]: ERROR nova.conductor.manager [ 581.361377] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-7451d85c-0306-4dfa-92a7-224a16738578 tempest-TenantUsagesTestJSON-822390215 tempest-TenantUsagesTestJSON-822390215-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 581.361737] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-7451d85c-0306-4dfa-92a7-224a16738578 tempest-TenantUsagesTestJSON-822390215 tempest-TenantUsagesTestJSON-822390215-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 581.362064] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-7451d85c-0306-4dfa-92a7-224a16738578 tempest-TenantUsagesTestJSON-822390215 tempest-TenantUsagesTestJSON-822390215-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 581.419654] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-7451d85c-0306-4dfa-92a7-224a16738578 tempest-TenantUsagesTestJSON-822390215 tempest-TenantUsagesTestJSON-822390215-project-member] [instance: d4b3341a-8a31-4775-9a3d-4fa01b329768] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 581.419654] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-7451d85c-0306-4dfa-92a7-224a16738578 tempest-TenantUsagesTestJSON-822390215 tempest-TenantUsagesTestJSON-822390215-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 581.419654] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-7451d85c-0306-4dfa-92a7-224a16738578 tempest-TenantUsagesTestJSON-822390215 tempest-TenantUsagesTestJSON-822390215-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 581.419900] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-7451d85c-0306-4dfa-92a7-224a16738578 tempest-TenantUsagesTestJSON-822390215 tempest-TenantUsagesTestJSON-822390215-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 581.422527] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-7451d85c-0306-4dfa-92a7-224a16738578 tempest-TenantUsagesTestJSON-822390215 tempest-TenantUsagesTestJSON-822390215-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 581.422527] nova-conductor[52626]: Traceback (most recent call last): [ 581.422527] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 581.422527] nova-conductor[52626]: return func(*args, **kwargs) [ 581.422527] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 581.422527] nova-conductor[52626]: selections = self._select_destinations( [ 581.422527] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 581.422527] nova-conductor[52626]: selections = self._schedule( [ 581.422527] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 581.422527] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 581.422527] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 581.422527] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 581.422527] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 581.422527] nova-conductor[52626]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 581.423947] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-7451d85c-0306-4dfa-92a7-224a16738578 tempest-TenantUsagesTestJSON-822390215 tempest-TenantUsagesTestJSON-822390215-project-member] [instance: d4b3341a-8a31-4775-9a3d-4fa01b329768] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 585.969588] nova-conductor[52625]: ERROR nova.conductor.manager [None req-bb4f5949-631d-48d8-9cdd-125f6affc535 tempest-ServersWithSpecificFlavorTestJSON-1868219997 tempest-ServersWithSpecificFlavorTestJSON-1868219997-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 585.969588] nova-conductor[52625]: Traceback (most recent call last): [ 585.969588] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 585.969588] nova-conductor[52625]: return func(*args, **kwargs) [ 585.969588] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 585.969588] nova-conductor[52625]: selections = self._select_destinations( [ 585.969588] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 585.969588] nova-conductor[52625]: selections = self._schedule( [ 585.969588] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 585.969588] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 585.969588] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 585.969588] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 585.969588] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 585.969588] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 585.969588] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 585.969588] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 585.969588] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 585.969588] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 585.969588] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 585.969588] nova-conductor[52625]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 585.970418] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 585.970418] nova-conductor[52625]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 585.970418] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 585.970418] nova-conductor[52625]: ERROR nova.conductor.manager result = self.transport._send( [ 585.970418] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 585.970418] nova-conductor[52625]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 585.970418] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 585.970418] nova-conductor[52625]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 585.970418] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 585.970418] nova-conductor[52625]: ERROR nova.conductor.manager raise result [ 585.970418] nova-conductor[52625]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 585.970418] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 585.970418] nova-conductor[52625]: ERROR nova.conductor.manager [ 585.970418] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 585.970418] nova-conductor[52625]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 585.970418] nova-conductor[52625]: ERROR nova.conductor.manager [ 585.970418] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 585.970418] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._select_destinations( [ 585.970418] nova-conductor[52625]: ERROR nova.conductor.manager [ 585.970960] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 585.970960] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._schedule( [ 585.970960] nova-conductor[52625]: ERROR nova.conductor.manager [ 585.970960] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 585.970960] nova-conductor[52625]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 585.970960] nova-conductor[52625]: ERROR nova.conductor.manager [ 585.970960] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 585.970960] nova-conductor[52625]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 585.970960] nova-conductor[52625]: ERROR nova.conductor.manager [ 585.970960] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
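The repeated traceback above records the scheduler's sufficient-hosts check failing: _ensure_sufficient_hosts (nova/scheduler/manager.py, line 499 in this run) finds fewer candidate hosts than requested instances and raises NoValidHost, which the conductor then receives over RPC. Below is a minimal, self-contained sketch of that check for illustration only; the class and function here are stand-ins, not the Nova source.

class NoValidHost(Exception):
    """Stand-in for nova.exception.NoValidHost."""
    def __init__(self, reason):
        super().__init__("No valid host was found. " + reason)

def ensure_sufficient_hosts(selected_hosts, num_instances):
    # Approximation of the scheduler's check: if scheduling could not place
    # every requested instance, abort the whole request.
    if len(selected_hosts) < num_instances:
        # With no enabled nova-compute hosts reporting in (as in this run),
        # the filtered host list is empty and this always raises.
        raise NoValidHost(reason="There are not enough hosts available.")

try:
    ensure_sufficient_hosts(selected_hosts=[], num_instances=1)
except NoValidHost as exc:
    print(exc)  # No valid host was found. There are not enough hosts available.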
[ 585.970960] nova-conductor[52625]: ERROR nova.conductor.manager [ 585.970960] nova-conductor[52625]: ERROR nova.conductor.manager [ 585.981186] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-bb4f5949-631d-48d8-9cdd-125f6affc535 tempest-ServersWithSpecificFlavorTestJSON-1868219997 tempest-ServersWithSpecificFlavorTestJSON-1868219997-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 585.981483] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-bb4f5949-631d-48d8-9cdd-125f6affc535 tempest-ServersWithSpecificFlavorTestJSON-1868219997 tempest-ServersWithSpecificFlavorTestJSON-1868219997-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 585.981716] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-bb4f5949-631d-48d8-9cdd-125f6affc535 tempest-ServersWithSpecificFlavorTestJSON-1868219997 tempest-ServersWithSpecificFlavorTestJSON-1868219997-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 586.061935] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-bb4f5949-631d-48d8-9cdd-125f6affc535 tempest-ServersWithSpecificFlavorTestJSON-1868219997 tempest-ServersWithSpecificFlavorTestJSON-1868219997-project-member] [instance: a75fa00c-6ff7-48aa-8440-cdca38f2ed9b] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 586.062703] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-bb4f5949-631d-48d8-9cdd-125f6affc535 tempest-ServersWithSpecificFlavorTestJSON-1868219997 tempest-ServersWithSpecificFlavorTestJSON-1868219997-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 586.063985] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-bb4f5949-631d-48d8-9cdd-125f6affc535 tempest-ServersWithSpecificFlavorTestJSON-1868219997 tempest-ServersWithSpecificFlavorTestJSON-1868219997-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 586.063985] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-bb4f5949-631d-48d8-9cdd-125f6affc535 
tempest-ServersWithSpecificFlavorTestJSON-1868219997 tempest-ServersWithSpecificFlavorTestJSON-1868219997-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 586.068957] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-bb4f5949-631d-48d8-9cdd-125f6affc535 tempest-ServersWithSpecificFlavorTestJSON-1868219997 tempest-ServersWithSpecificFlavorTestJSON-1868219997-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 586.068957] nova-conductor[52625]: Traceback (most recent call last): [ 586.068957] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 586.068957] nova-conductor[52625]: return func(*args, **kwargs) [ 586.068957] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 586.068957] nova-conductor[52625]: selections = self._select_destinations( [ 586.068957] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 586.068957] nova-conductor[52625]: selections = self._schedule( [ 586.068957] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 586.068957] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 586.068957] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 586.068957] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 586.068957] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 586.068957] nova-conductor[52625]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 586.069752] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-bb4f5949-631d-48d8-9cdd-125f6affc535 tempest-ServersWithSpecificFlavorTestJSON-1868219997 tempest-ServersWithSpecificFlavorTestJSON-1868219997-project-member] [instance: a75fa00c-6ff7-48aa-8440-cdca38f2ed9b] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 591.767496] nova-conductor[52626]: ERROR nova.conductor.manager [None req-cb5a058b-6c7d-450a-b644-c49d2faaf8a1 tempest-ServerAddressesNegativeTestJSON-302997532 tempest-ServerAddressesNegativeTestJSON-302997532-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 591.767496] nova-conductor[52626]: Traceback (most recent call last): [ 591.767496] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 591.767496] nova-conductor[52626]: return func(*args, **kwargs) [ 591.767496] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 591.767496] nova-conductor[52626]: selections = self._select_destinations( [ 591.767496] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 591.767496] nova-conductor[52626]: selections = self._schedule( [ 591.767496] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 591.767496] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 591.767496] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 591.767496] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 591.767496] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 591.767496] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 591.767496] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 591.767496] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 591.767496] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 591.767496] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 591.767496] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 591.767496] nova-conductor[52626]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 591.769143] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 591.769143] nova-conductor[52626]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 591.769143] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 591.769143] nova-conductor[52626]: ERROR nova.conductor.manager result = self.transport._send( [ 591.769143] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 591.769143] nova-conductor[52626]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 591.769143] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 591.769143] nova-conductor[52626]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 591.769143] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 591.769143] nova-conductor[52626]: ERROR nova.conductor.manager raise result [ 591.769143] nova-conductor[52626]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 591.769143] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 591.769143] nova-conductor[52626]: ERROR nova.conductor.manager [ 591.769143] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 591.769143] nova-conductor[52626]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 591.769143] nova-conductor[52626]: ERROR nova.conductor.manager [ 591.769143] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 591.769143] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._select_destinations( [ 591.769143] nova-conductor[52626]: ERROR nova.conductor.manager [ 591.769686] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 591.769686] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._schedule( [ 591.769686] nova-conductor[52626]: ERROR nova.conductor.manager [ 591.769686] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 591.769686] nova-conductor[52626]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 591.769686] nova-conductor[52626]: ERROR nova.conductor.manager [ 591.769686] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 591.769686] nova-conductor[52626]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 591.769686] nova-conductor[52626]: ERROR nova.conductor.manager [ 591.769686] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
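Each scheduling attempt in this log is bracketed by oslo.concurrency DEBUG triplets (Acquiring / acquired / released) around nova.context.set_target_cell; the lock name is the cell UUID, with 00000000-0000-0000-0000-000000000000 being cell0. The sketch below shows how a named lock produces such messages; it assumes oslo.concurrency is installed and DEBUG logging is enabled, and the decorated function body is purely illustrative.

from oslo_concurrency import lockutils

# Guarding a function with a named lock; when DEBUG logging is on,
# oslo.concurrency emits one line each for "Acquiring", "acquired ... waited",
# and "released ... held", matching the triplets around set_target_cell above.
@lockutils.synchronized("00000000-0000-0000-0000-000000000000")
def get_or_set_cached_cell_and_set_connections():
    # Illustrative body: in Nova this caches the target cell's database and
    # message-queue connections so they are only built once per cell.
    return "cell0 connections"

if __name__ == "__main__":
    get_or_set_cached_cell_and_set_connections()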
[ 591.769686] nova-conductor[52626]: ERROR nova.conductor.manager [ 591.769686] nova-conductor[52626]: ERROR nova.conductor.manager [ 591.781274] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-cb5a058b-6c7d-450a-b644-c49d2faaf8a1 tempest-ServerAddressesNegativeTestJSON-302997532 tempest-ServerAddressesNegativeTestJSON-302997532-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 591.781518] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-cb5a058b-6c7d-450a-b644-c49d2faaf8a1 tempest-ServerAddressesNegativeTestJSON-302997532 tempest-ServerAddressesNegativeTestJSON-302997532-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 591.781689] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-cb5a058b-6c7d-450a-b644-c49d2faaf8a1 tempest-ServerAddressesNegativeTestJSON-302997532 tempest-ServerAddressesNegativeTestJSON-302997532-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 591.873923] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-cb5a058b-6c7d-450a-b644-c49d2faaf8a1 tempest-ServerAddressesNegativeTestJSON-302997532 tempest-ServerAddressesNegativeTestJSON-302997532-project-member] [instance: 23f550a7-4da0-470a-84ad-bd77833e6312] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 591.873923] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-cb5a058b-6c7d-450a-b644-c49d2faaf8a1 tempest-ServerAddressesNegativeTestJSON-302997532 tempest-ServerAddressesNegativeTestJSON-302997532-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 591.873923] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-cb5a058b-6c7d-450a-b644-c49d2faaf8a1 tempest-ServerAddressesNegativeTestJSON-302997532 tempest-ServerAddressesNegativeTestJSON-302997532-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 591.874236] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-cb5a058b-6c7d-450a-b644-c49d2faaf8a1 
tempest-ServerAddressesNegativeTestJSON-302997532 tempest-ServerAddressesNegativeTestJSON-302997532-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 591.876927] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-cb5a058b-6c7d-450a-b644-c49d2faaf8a1 tempest-ServerAddressesNegativeTestJSON-302997532 tempest-ServerAddressesNegativeTestJSON-302997532-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 591.876927] nova-conductor[52626]: Traceback (most recent call last): [ 591.876927] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 591.876927] nova-conductor[52626]: return func(*args, **kwargs) [ 591.876927] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 591.876927] nova-conductor[52626]: selections = self._select_destinations( [ 591.876927] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 591.876927] nova-conductor[52626]: selections = self._schedule( [ 591.876927] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 591.876927] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 591.876927] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 591.876927] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 591.876927] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 591.876927] nova-conductor[52626]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 591.877517] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-cb5a058b-6c7d-450a-b644-c49d2faaf8a1 tempest-ServerAddressesNegativeTestJSON-302997532 tempest-ServerAddressesNegativeTestJSON-302997532-project-member] [instance: 23f550a7-4da0-470a-84ad-bd77833e6312] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 596.860889] nova-conductor[52625]: ERROR nova.conductor.manager [None req-21495db4-108d-4ea1-9639-3ffe44898f28 tempest-ServersAdminTestJSON-244411966 tempest-ServersAdminTestJSON-244411966-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 596.860889] nova-conductor[52625]: Traceback (most recent call last): [ 596.860889] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 596.860889] nova-conductor[52625]: return func(*args, **kwargs) [ 596.860889] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 596.860889] nova-conductor[52625]: selections = self._select_destinations( [ 596.860889] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 596.860889] nova-conductor[52625]: selections = self._schedule( [ 596.860889] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 596.860889] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 596.860889] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 596.860889] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 596.860889] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 596.860889] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 596.860889] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 596.860889] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 596.860889] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 596.860889] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 596.860889] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 596.860889] nova-conductor[52625]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 596.860889] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 596.861800] nova-conductor[52625]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 596.861800] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 596.861800] nova-conductor[52625]: ERROR nova.conductor.manager result = self.transport._send( [ 596.861800] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 596.861800] nova-conductor[52625]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 596.861800] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 596.861800] nova-conductor[52625]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 596.861800] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 596.861800] nova-conductor[52625]: ERROR nova.conductor.manager raise result [ 596.861800] nova-conductor[52625]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 596.861800] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 596.861800] nova-conductor[52625]: ERROR nova.conductor.manager [ 596.861800] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 596.861800] nova-conductor[52625]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 596.861800] nova-conductor[52625]: ERROR nova.conductor.manager [ 596.861800] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 596.861800] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._select_destinations( [ 596.861800] nova-conductor[52625]: ERROR nova.conductor.manager [ 596.861800] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 596.861800] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._schedule( [ 596.861800] nova-conductor[52625]: ERROR nova.conductor.manager [ 596.862417] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 596.862417] nova-conductor[52625]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 596.862417] nova-conductor[52625]: ERROR nova.conductor.manager [ 596.862417] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 596.862417] nova-conductor[52625]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 596.862417] nova-conductor[52625]: ERROR nova.conductor.manager [ 596.862417] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 596.862417] nova-conductor[52625]: ERROR nova.conductor.manager [ 596.862417] nova-conductor[52625]: ERROR nova.conductor.manager [ 596.870489] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-21495db4-108d-4ea1-9639-3ffe44898f28 tempest-ServersAdminTestJSON-244411966 tempest-ServersAdminTestJSON-244411966-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 596.870772] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-21495db4-108d-4ea1-9639-3ffe44898f28 tempest-ServersAdminTestJSON-244411966 tempest-ServersAdminTestJSON-244411966-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 596.870938] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-21495db4-108d-4ea1-9639-3ffe44898f28 tempest-ServersAdminTestJSON-244411966 tempest-ServersAdminTestJSON-244411966-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 596.931035] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-21495db4-108d-4ea1-9639-3ffe44898f28 tempest-ServersAdminTestJSON-244411966 tempest-ServersAdminTestJSON-244411966-project-member] [instance: 46d2ecfa-0654-4847-a7d2-2b0540026bed] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 596.931035] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-21495db4-108d-4ea1-9639-3ffe44898f28 tempest-ServersAdminTestJSON-244411966 tempest-ServersAdminTestJSON-244411966-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 596.931035] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-21495db4-108d-4ea1-9639-3ffe44898f28 tempest-ServersAdminTestJSON-244411966 tempest-ServersAdminTestJSON-244411966-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 596.931388] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-21495db4-108d-4ea1-9639-3ffe44898f28 tempest-ServersAdminTestJSON-244411966 tempest-ServersAdminTestJSON-244411966-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 596.936180] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-21495db4-108d-4ea1-9639-3ffe44898f28 tempest-ServersAdminTestJSON-244411966 tempest-ServersAdminTestJSON-244411966-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 596.936180] nova-conductor[52625]: Traceback (most recent call last): [ 596.936180] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 596.936180] nova-conductor[52625]: return func(*args, **kwargs) [ 596.936180] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 596.936180] nova-conductor[52625]: selections = self._select_destinations( [ 596.936180] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 596.936180] nova-conductor[52625]: selections = self._schedule( [ 596.936180] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 596.936180] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 596.936180] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 596.936180] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 596.936180] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 596.936180] nova-conductor[52625]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 596.936180] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-21495db4-108d-4ea1-9639-3ffe44898f28 tempest-ServersAdminTestJSON-244411966 tempest-ServersAdminTestJSON-244411966-project-member] [instance: 46d2ecfa-0654-4847-a7d2-2b0540026bed] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 598.854048] nova-conductor[52626]: ERROR nova.conductor.manager [None req-ca350b48-bd30-45e9-aa15-14e6a61db3ae tempest-AttachInterfacesV270Test-827110292 tempest-AttachInterfacesV270Test-827110292-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 598.854048] nova-conductor[52626]: Traceback (most recent call last): [ 598.854048] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 598.854048] nova-conductor[52626]: return func(*args, **kwargs) [ 598.854048] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 598.854048] nova-conductor[52626]: selections = self._select_destinations( [ 598.854048] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 598.854048] nova-conductor[52626]: selections = self._schedule( [ 598.854048] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 598.854048] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 598.854048] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 598.854048] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 598.854048] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 598.854048] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 598.854048] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 598.854048] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 598.854048] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 598.854048] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 598.854048] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 598.854048] nova-conductor[52626]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 598.854048] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 598.855633] nova-conductor[52626]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 598.855633] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 598.855633] nova-conductor[52626]: ERROR nova.conductor.manager result = self.transport._send( [ 598.855633] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 598.855633] nova-conductor[52626]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 598.855633] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 598.855633] nova-conductor[52626]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 598.855633] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 598.855633] nova-conductor[52626]: ERROR nova.conductor.manager raise result [ 598.855633] nova-conductor[52626]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 598.855633] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 598.855633] nova-conductor[52626]: ERROR nova.conductor.manager [ 598.855633] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 598.855633] nova-conductor[52626]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 598.855633] nova-conductor[52626]: ERROR nova.conductor.manager [ 598.855633] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 598.855633] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._select_destinations( [ 598.855633] nova-conductor[52626]: ERROR nova.conductor.manager [ 598.855633] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 598.855633] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._schedule( [ 598.855633] nova-conductor[52626]: ERROR nova.conductor.manager [ 598.856680] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 598.856680] nova-conductor[52626]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 598.856680] nova-conductor[52626]: ERROR nova.conductor.manager [ 598.856680] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 598.856680] nova-conductor[52626]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 598.856680] nova-conductor[52626]: ERROR nova.conductor.manager [ 598.856680] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 598.856680] nova-conductor[52626]: ERROR nova.conductor.manager [ 598.856680] nova-conductor[52626]: ERROR nova.conductor.manager [ 598.863462] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-ca350b48-bd30-45e9-aa15-14e6a61db3ae tempest-AttachInterfacesV270Test-827110292 tempest-AttachInterfacesV270Test-827110292-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 598.863462] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-ca350b48-bd30-45e9-aa15-14e6a61db3ae tempest-AttachInterfacesV270Test-827110292 tempest-AttachInterfacesV270Test-827110292-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 598.863462] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-ca350b48-bd30-45e9-aa15-14e6a61db3ae tempest-AttachInterfacesV270Test-827110292 tempest-AttachInterfacesV270Test-827110292-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 598.939219] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-ca350b48-bd30-45e9-aa15-14e6a61db3ae tempest-AttachInterfacesV270Test-827110292 tempest-AttachInterfacesV270Test-827110292-project-member] [instance: 42a7ac29-e239-47b9-96d7-b4422088eb91] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 598.939951] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-ca350b48-bd30-45e9-aa15-14e6a61db3ae tempest-AttachInterfacesV270Test-827110292 tempest-AttachInterfacesV270Test-827110292-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 598.941043] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-ca350b48-bd30-45e9-aa15-14e6a61db3ae tempest-AttachInterfacesV270Test-827110292 tempest-AttachInterfacesV270Test-827110292-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 598.941043] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-ca350b48-bd30-45e9-aa15-14e6a61db3ae tempest-AttachInterfacesV270Test-827110292 tempest-AttachInterfacesV270Test-827110292-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 598.947200] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-ca350b48-bd30-45e9-aa15-14e6a61db3ae tempest-AttachInterfacesV270Test-827110292 tempest-AttachInterfacesV270Test-827110292-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 598.947200] nova-conductor[52626]: Traceback (most recent call last): [ 598.947200] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 598.947200] nova-conductor[52626]: return func(*args, **kwargs) [ 598.947200] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 598.947200] nova-conductor[52626]: selections = self._select_destinations( [ 598.947200] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 598.947200] nova-conductor[52626]: selections = self._schedule( [ 598.947200] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 598.947200] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 598.947200] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 598.947200] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 598.947200] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 598.947200] nova-conductor[52626]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 598.947200] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-ca350b48-bd30-45e9-aa15-14e6a61db3ae tempest-AttachInterfacesV270Test-827110292 tempest-AttachInterfacesV270Test-827110292-project-member] [instance: 42a7ac29-e239-47b9-96d7-b4422088eb91] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 600.387166] nova-conductor[52625]: ERROR nova.conductor.manager [None req-8724b41c-92ee-4a7f-abfb-bfefa577bbb1 tempest-ServersAdminTestJSON-244411966 tempest-ServersAdminTestJSON-244411966-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 600.387166] nova-conductor[52625]: Traceback (most recent call last): [ 600.387166] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 600.387166] nova-conductor[52625]: return func(*args, **kwargs) [ 600.387166] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 600.387166] nova-conductor[52625]: selections = self._select_destinations( [ 600.387166] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 600.387166] nova-conductor[52625]: selections = self._schedule( [ 600.387166] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 600.387166] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 600.387166] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 600.387166] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 600.387166] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 600.387166] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 600.387166] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 600.387166] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 600.387166] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 600.387166] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 600.387166] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 600.387166] nova-conductor[52625]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 600.387166] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 600.387963] nova-conductor[52625]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 600.387963] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 600.387963] nova-conductor[52625]: ERROR nova.conductor.manager result = self.transport._send( [ 600.387963] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 600.387963] nova-conductor[52625]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 600.387963] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 600.387963] nova-conductor[52625]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 600.387963] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 600.387963] nova-conductor[52625]: ERROR nova.conductor.manager raise result [ 600.387963] nova-conductor[52625]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 600.387963] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 600.387963] nova-conductor[52625]: ERROR nova.conductor.manager [ 600.387963] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 600.387963] nova-conductor[52625]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 600.387963] nova-conductor[52625]: ERROR nova.conductor.manager [ 600.387963] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 600.387963] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._select_destinations( [ 600.387963] nova-conductor[52625]: ERROR nova.conductor.manager [ 600.387963] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 600.387963] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._schedule( [ 600.387963] nova-conductor[52625]: ERROR nova.conductor.manager [ 600.388633] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 600.388633] nova-conductor[52625]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 600.388633] nova-conductor[52625]: ERROR nova.conductor.manager [ 600.388633] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 600.388633] nova-conductor[52625]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 600.388633] nova-conductor[52625]: ERROR nova.conductor.manager [ 600.388633] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 600.388633] nova-conductor[52625]: ERROR nova.conductor.manager [ 600.388633] nova-conductor[52625]: ERROR nova.conductor.manager [ 600.397417] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-8724b41c-92ee-4a7f-abfb-bfefa577bbb1 tempest-ServersAdminTestJSON-244411966 tempest-ServersAdminTestJSON-244411966-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 600.397958] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-8724b41c-92ee-4a7f-abfb-bfefa577bbb1 tempest-ServersAdminTestJSON-244411966 tempest-ServersAdminTestJSON-244411966-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.002s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 600.398966] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-8724b41c-92ee-4a7f-abfb-bfefa577bbb1 tempest-ServersAdminTestJSON-244411966 tempest-ServersAdminTestJSON-244411966-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 600.461774] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-8724b41c-92ee-4a7f-abfb-bfefa577bbb1 tempest-ServersAdminTestJSON-244411966 tempest-ServersAdminTestJSON-244411966-project-member] [instance: ad2f72ea-def7-4a0b-98f3-6c4b3b2f6ccf] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 600.461774] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-8724b41c-92ee-4a7f-abfb-bfefa577bbb1 tempest-ServersAdminTestJSON-244411966 tempest-ServersAdminTestJSON-244411966-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 600.461873] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-8724b41c-92ee-4a7f-abfb-bfefa577bbb1 tempest-ServersAdminTestJSON-244411966 tempest-ServersAdminTestJSON-244411966-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 600.461982] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-8724b41c-92ee-4a7f-abfb-bfefa577bbb1 tempest-ServersAdminTestJSON-244411966 tempest-ServersAdminTestJSON-244411966-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 600.466016] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-8724b41c-92ee-4a7f-abfb-bfefa577bbb1 tempest-ServersAdminTestJSON-244411966 tempest-ServersAdminTestJSON-244411966-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 600.466016] nova-conductor[52625]: Traceback (most recent call last): [ 600.466016] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 600.466016] nova-conductor[52625]: return func(*args, **kwargs) [ 600.466016] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 600.466016] nova-conductor[52625]: selections = self._select_destinations( [ 600.466016] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 600.466016] nova-conductor[52625]: selections = self._schedule( [ 600.466016] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 600.466016] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 600.466016] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 600.466016] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 600.466016] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 600.466016] nova-conductor[52625]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 600.466608] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-8724b41c-92ee-4a7f-abfb-bfefa577bbb1 tempest-ServersAdminTestJSON-244411966 tempest-ServersAdminTestJSON-244411966-project-member] [instance: ad2f72ea-def7-4a0b-98f3-6c4b3b2f6ccf] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 600.896138] nova-conductor[52626]: ERROR nova.conductor.manager [None req-9de4ee3f-15c9-4e5d-a3a8-257c6b665ef6 tempest-ServerRescueTestJSONUnderV235-164382788 tempest-ServerRescueTestJSONUnderV235-164382788-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 600.896138] nova-conductor[52626]: Traceback (most recent call last): [ 600.896138] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 600.896138] nova-conductor[52626]: return func(*args, **kwargs) [ 600.896138] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 600.896138] nova-conductor[52626]: selections = self._select_destinations( [ 600.896138] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 600.896138] nova-conductor[52626]: selections = self._schedule( [ 600.896138] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 600.896138] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 600.896138] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 600.896138] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 600.896138] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 600.896138] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 600.896138] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 600.896138] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 600.896138] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 600.896138] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 600.896138] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 600.896138] nova-conductor[52626]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 600.896138] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 600.896933] nova-conductor[52626]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 600.896933] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 600.896933] nova-conductor[52626]: ERROR nova.conductor.manager result = self.transport._send( [ 600.896933] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 600.896933] nova-conductor[52626]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 600.896933] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 600.896933] nova-conductor[52626]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 600.896933] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 600.896933] nova-conductor[52626]: ERROR nova.conductor.manager raise result [ 600.896933] nova-conductor[52626]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 600.896933] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 600.896933] nova-conductor[52626]: ERROR nova.conductor.manager [ 600.896933] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 600.896933] nova-conductor[52626]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 600.896933] nova-conductor[52626]: ERROR nova.conductor.manager [ 600.896933] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 600.896933] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._select_destinations( [ 600.896933] nova-conductor[52626]: ERROR nova.conductor.manager [ 600.896933] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 600.896933] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._schedule( [ 600.896933] nova-conductor[52626]: ERROR nova.conductor.manager [ 600.897821] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 600.897821] nova-conductor[52626]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 600.897821] nova-conductor[52626]: ERROR nova.conductor.manager [ 600.897821] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 600.897821] nova-conductor[52626]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 600.897821] nova-conductor[52626]: ERROR nova.conductor.manager [ 600.897821] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 600.897821] nova-conductor[52626]: ERROR nova.conductor.manager [ 600.897821] nova-conductor[52626]: ERROR nova.conductor.manager [ 600.902297] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-9de4ee3f-15c9-4e5d-a3a8-257c6b665ef6 tempest-ServerRescueTestJSONUnderV235-164382788 tempest-ServerRescueTestJSONUnderV235-164382788-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 600.902612] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-9de4ee3f-15c9-4e5d-a3a8-257c6b665ef6 tempest-ServerRescueTestJSONUnderV235-164382788 tempest-ServerRescueTestJSONUnderV235-164382788-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 600.902861] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-9de4ee3f-15c9-4e5d-a3a8-257c6b665ef6 tempest-ServerRescueTestJSONUnderV235-164382788 tempest-ServerRescueTestJSONUnderV235-164382788-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 600.958618] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-9de4ee3f-15c9-4e5d-a3a8-257c6b665ef6 tempest-ServerRescueTestJSONUnderV235-164382788 tempest-ServerRescueTestJSONUnderV235-164382788-project-member] [instance: caabf911-e810-434e-8890-ad7ca28333b6] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 600.959675] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-9de4ee3f-15c9-4e5d-a3a8-257c6b665ef6 tempest-ServerRescueTestJSONUnderV235-164382788 tempest-ServerRescueTestJSONUnderV235-164382788-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 600.959884] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-9de4ee3f-15c9-4e5d-a3a8-257c6b665ef6 tempest-ServerRescueTestJSONUnderV235-164382788 tempest-ServerRescueTestJSONUnderV235-164382788-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 600.959884] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-9de4ee3f-15c9-4e5d-a3a8-257c6b665ef6 tempest-ServerRescueTestJSONUnderV235-164382788 
tempest-ServerRescueTestJSONUnderV235-164382788-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 600.965023] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-9de4ee3f-15c9-4e5d-a3a8-257c6b665ef6 tempest-ServerRescueTestJSONUnderV235-164382788 tempest-ServerRescueTestJSONUnderV235-164382788-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 600.965023] nova-conductor[52626]: Traceback (most recent call last): [ 600.965023] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 600.965023] nova-conductor[52626]: return func(*args, **kwargs) [ 600.965023] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 600.965023] nova-conductor[52626]: selections = self._select_destinations( [ 600.965023] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 600.965023] nova-conductor[52626]: selections = self._schedule( [ 600.965023] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 600.965023] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 600.965023] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 600.965023] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 600.965023] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 600.965023] nova-conductor[52626]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 600.965023] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-9de4ee3f-15c9-4e5d-a3a8-257c6b665ef6 tempest-ServerRescueTestJSONUnderV235-164382788 tempest-ServerRescueTestJSONUnderV235-164382788-project-member] [instance: caabf911-e810-434e-8890-ad7ca28333b6] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 601.908056] nova-conductor[52625]: ERROR nova.conductor.manager [None req-e90e3686-c1a7-4886-a03c-7617278a804d tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 601.908056] nova-conductor[52625]: Traceback (most recent call last): [ 601.908056] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 601.908056] nova-conductor[52625]: return func(*args, **kwargs) [ 601.908056] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 601.908056] nova-conductor[52625]: selections = self._select_destinations( [ 601.908056] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 601.908056] nova-conductor[52625]: selections = self._schedule( [ 601.908056] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 601.908056] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 601.908056] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 601.908056] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 601.908056] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 601.908056] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 601.908056] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 601.908056] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 601.908056] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 601.908056] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 601.908056] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 601.908056] nova-conductor[52625]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 601.908056] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 601.908870] nova-conductor[52625]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 601.908870] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 601.908870] nova-conductor[52625]: ERROR nova.conductor.manager result = self.transport._send( [ 601.908870] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 601.908870] nova-conductor[52625]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 601.908870] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 601.908870] nova-conductor[52625]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 601.908870] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 601.908870] nova-conductor[52625]: ERROR nova.conductor.manager raise result [ 601.908870] nova-conductor[52625]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 601.908870] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 601.908870] nova-conductor[52625]: ERROR nova.conductor.manager [ 601.908870] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 601.908870] nova-conductor[52625]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 601.908870] nova-conductor[52625]: ERROR nova.conductor.manager [ 601.908870] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 601.908870] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._select_destinations( [ 601.908870] nova-conductor[52625]: ERROR nova.conductor.manager [ 601.908870] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 601.908870] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._schedule( [ 601.908870] nova-conductor[52625]: ERROR nova.conductor.manager [ 601.909613] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 601.909613] nova-conductor[52625]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 601.909613] nova-conductor[52625]: ERROR nova.conductor.manager [ 601.909613] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 601.909613] nova-conductor[52625]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 601.909613] nova-conductor[52625]: ERROR nova.conductor.manager [ 601.909613] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 601.909613] nova-conductor[52625]: ERROR nova.conductor.manager [ 601.909613] nova-conductor[52625]: ERROR nova.conductor.manager [ 601.918933] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-e90e3686-c1a7-4886-a03c-7617278a804d tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 601.919390] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-e90e3686-c1a7-4886-a03c-7617278a804d tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 601.919972] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-e90e3686-c1a7-4886-a03c-7617278a804d tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 601.981695] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-e90e3686-c1a7-4886-a03c-7617278a804d tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: cd90ce23-36e0-4b2a-a3ae-edab16ccca56] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 601.982477] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-e90e3686-c1a7-4886-a03c-7617278a804d tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 601.983068] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-e90e3686-c1a7-4886-a03c-7617278a804d tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 601.983068] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-e90e3686-c1a7-4886-a03c-7617278a804d tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) 
inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 601.987603] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-e90e3686-c1a7-4886-a03c-7617278a804d tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 601.987603] nova-conductor[52625]: Traceback (most recent call last): [ 601.987603] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 601.987603] nova-conductor[52625]: return func(*args, **kwargs) [ 601.987603] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 601.987603] nova-conductor[52625]: selections = self._select_destinations( [ 601.987603] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 601.987603] nova-conductor[52625]: selections = self._schedule( [ 601.987603] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 601.987603] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 601.987603] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 601.987603] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 601.987603] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 601.987603] nova-conductor[52625]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 601.988146] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-e90e3686-c1a7-4886-a03c-7617278a804d tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: cd90ce23-36e0-4b2a-a3ae-edab16ccca56] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 603.125773] nova-conductor[52626]: ERROR nova.conductor.manager [None req-11de9b5a-2e88-44d2-975e-fd3fe8e772e8 tempest-AttachInterfacesUnderV243Test-936684357 tempest-AttachInterfacesUnderV243Test-936684357-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 603.125773] nova-conductor[52626]: Traceback (most recent call last): [ 603.125773] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 603.125773] nova-conductor[52626]: return func(*args, **kwargs) [ 603.125773] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 603.125773] nova-conductor[52626]: selections = self._select_destinations( [ 603.125773] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 603.125773] nova-conductor[52626]: selections = self._schedule( [ 603.125773] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 603.125773] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 603.125773] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 603.125773] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 603.125773] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 603.125773] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 603.125773] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 603.125773] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 603.125773] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 603.125773] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 603.125773] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 603.125773] nova-conductor[52626]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 603.125773] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 603.127406] nova-conductor[52626]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 603.127406] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 603.127406] nova-conductor[52626]: ERROR nova.conductor.manager result = self.transport._send( [ 603.127406] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 603.127406] nova-conductor[52626]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 603.127406] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 603.127406] nova-conductor[52626]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 603.127406] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 603.127406] nova-conductor[52626]: ERROR nova.conductor.manager raise result [ 603.127406] nova-conductor[52626]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 603.127406] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 603.127406] nova-conductor[52626]: ERROR nova.conductor.manager [ 603.127406] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 603.127406] nova-conductor[52626]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 603.127406] nova-conductor[52626]: ERROR nova.conductor.manager [ 603.127406] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 603.127406] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._select_destinations( [ 603.127406] nova-conductor[52626]: ERROR nova.conductor.manager [ 603.127406] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 603.127406] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._schedule( [ 603.127406] nova-conductor[52626]: ERROR nova.conductor.manager [ 603.128198] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 603.128198] nova-conductor[52626]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 603.128198] nova-conductor[52626]: ERROR nova.conductor.manager [ 603.128198] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 603.128198] nova-conductor[52626]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 603.128198] nova-conductor[52626]: ERROR nova.conductor.manager [ 603.128198] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 603.128198] nova-conductor[52626]: ERROR nova.conductor.manager [ 603.128198] nova-conductor[52626]: ERROR nova.conductor.manager [ 603.134753] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-11de9b5a-2e88-44d2-975e-fd3fe8e772e8 tempest-AttachInterfacesUnderV243Test-936684357 tempest-AttachInterfacesUnderV243Test-936684357-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 603.134936] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-11de9b5a-2e88-44d2-975e-fd3fe8e772e8 tempest-AttachInterfacesUnderV243Test-936684357 tempest-AttachInterfacesUnderV243Test-936684357-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 603.135100] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-11de9b5a-2e88-44d2-975e-fd3fe8e772e8 tempest-AttachInterfacesUnderV243Test-936684357 tempest-AttachInterfacesUnderV243Test-936684357-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 603.203437] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-11de9b5a-2e88-44d2-975e-fd3fe8e772e8 tempest-AttachInterfacesUnderV243Test-936684357 tempest-AttachInterfacesUnderV243Test-936684357-project-member] [instance: 1a2761ee-a517-45f6-aecf-12bc8ca9275b] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 603.204164] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-11de9b5a-2e88-44d2-975e-fd3fe8e772e8 tempest-AttachInterfacesUnderV243Test-936684357 tempest-AttachInterfacesUnderV243Test-936684357-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 603.204164] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-11de9b5a-2e88-44d2-975e-fd3fe8e772e8 tempest-AttachInterfacesUnderV243Test-936684357 tempest-AttachInterfacesUnderV243Test-936684357-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 603.205855] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-11de9b5a-2e88-44d2-975e-fd3fe8e772e8 tempest-AttachInterfacesUnderV243Test-936684357 
tempest-AttachInterfacesUnderV243Test-936684357-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 603.211580] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-11de9b5a-2e88-44d2-975e-fd3fe8e772e8 tempest-AttachInterfacesUnderV243Test-936684357 tempest-AttachInterfacesUnderV243Test-936684357-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 603.211580] nova-conductor[52626]: Traceback (most recent call last): [ 603.211580] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 603.211580] nova-conductor[52626]: return func(*args, **kwargs) [ 603.211580] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 603.211580] nova-conductor[52626]: selections = self._select_destinations( [ 603.211580] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 603.211580] nova-conductor[52626]: selections = self._schedule( [ 603.211580] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 603.211580] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 603.211580] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 603.211580] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 603.211580] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 603.211580] nova-conductor[52626]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 603.213307] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-11de9b5a-2e88-44d2-975e-fd3fe8e772e8 tempest-AttachInterfacesUnderV243Test-936684357 tempest-AttachInterfacesUnderV243Test-936684357-project-member] [instance: 1a2761ee-a517-45f6-aecf-12bc8ca9275b] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 604.042218] nova-conductor[52625]: ERROR nova.conductor.manager [None req-a7ada3e9-7975-4e3d-9e6d-b8fa69e5f7db tempest-MultipleCreateTestJSON-1489643269 tempest-MultipleCreateTestJSON-1489643269-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 604.042218] nova-conductor[52625]: Traceback (most recent call last): [ 604.042218] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 604.042218] nova-conductor[52625]: return func(*args, **kwargs) [ 604.042218] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 604.042218] nova-conductor[52625]: selections = self._select_destinations( [ 604.042218] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 604.042218] nova-conductor[52625]: selections = self._schedule( [ 604.042218] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 604.042218] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 604.042218] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 604.042218] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 604.042218] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 604.042218] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 604.042218] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 604.042218] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 604.042218] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 604.042218] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 604.042218] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 604.042218] nova-conductor[52625]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 604.042218] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 604.043055] nova-conductor[52625]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 604.043055] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 604.043055] nova-conductor[52625]: ERROR nova.conductor.manager result = self.transport._send( [ 604.043055] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 604.043055] nova-conductor[52625]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 604.043055] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 604.043055] nova-conductor[52625]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 604.043055] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 604.043055] nova-conductor[52625]: ERROR nova.conductor.manager raise result [ 604.043055] nova-conductor[52625]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 604.043055] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 604.043055] nova-conductor[52625]: ERROR nova.conductor.manager [ 604.043055] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 604.043055] nova-conductor[52625]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 604.043055] nova-conductor[52625]: ERROR nova.conductor.manager [ 604.043055] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 604.043055] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._select_destinations( [ 604.043055] nova-conductor[52625]: ERROR nova.conductor.manager [ 604.043055] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 604.043055] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._schedule( [ 604.043055] nova-conductor[52625]: ERROR nova.conductor.manager [ 604.043815] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 604.043815] nova-conductor[52625]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 604.043815] nova-conductor[52625]: ERROR nova.conductor.manager [ 604.043815] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 604.043815] nova-conductor[52625]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 604.043815] nova-conductor[52625]: ERROR nova.conductor.manager [ 604.043815] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 604.043815] nova-conductor[52625]: ERROR nova.conductor.manager [ 604.043815] nova-conductor[52625]: ERROR nova.conductor.manager [ 604.053900] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-a7ada3e9-7975-4e3d-9e6d-b8fa69e5f7db tempest-MultipleCreateTestJSON-1489643269 tempest-MultipleCreateTestJSON-1489643269-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 604.054282] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-a7ada3e9-7975-4e3d-9e6d-b8fa69e5f7db tempest-MultipleCreateTestJSON-1489643269 tempest-MultipleCreateTestJSON-1489643269-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 604.054366] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-a7ada3e9-7975-4e3d-9e6d-b8fa69e5f7db tempest-MultipleCreateTestJSON-1489643269 tempest-MultipleCreateTestJSON-1489643269-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 604.108937] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-a7ada3e9-7975-4e3d-9e6d-b8fa69e5f7db tempest-MultipleCreateTestJSON-1489643269 tempest-MultipleCreateTestJSON-1489643269-project-member] [instance: 9e3ec347-5a8b-448a-a8c8-6ffbd230edd9] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 604.109762] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-a7ada3e9-7975-4e3d-9e6d-b8fa69e5f7db tempest-MultipleCreateTestJSON-1489643269 tempest-MultipleCreateTestJSON-1489643269-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 604.109762] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-a7ada3e9-7975-4e3d-9e6d-b8fa69e5f7db tempest-MultipleCreateTestJSON-1489643269 tempest-MultipleCreateTestJSON-1489643269-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 604.109762] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-a7ada3e9-7975-4e3d-9e6d-b8fa69e5f7db tempest-MultipleCreateTestJSON-1489643269 tempest-MultipleCreateTestJSON-1489643269-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 604.113259] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-a7ada3e9-7975-4e3d-9e6d-b8fa69e5f7db tempest-MultipleCreateTestJSON-1489643269 tempest-MultipleCreateTestJSON-1489643269-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 604.113259] nova-conductor[52625]: Traceback (most recent call last): [ 604.113259] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 604.113259] nova-conductor[52625]: return func(*args, **kwargs) [ 604.113259] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 604.113259] nova-conductor[52625]: selections = self._select_destinations( [ 604.113259] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 604.113259] nova-conductor[52625]: selections = self._schedule( [ 604.113259] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 604.113259] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 604.113259] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 604.113259] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 604.113259] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 604.113259] nova-conductor[52625]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 604.114766] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-a7ada3e9-7975-4e3d-9e6d-b8fa69e5f7db tempest-MultipleCreateTestJSON-1489643269 tempest-MultipleCreateTestJSON-1489643269-project-member] [instance: 9e3ec347-5a8b-448a-a8c8-6ffbd230edd9] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 604.136570] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-a7ada3e9-7975-4e3d-9e6d-b8fa69e5f7db tempest-MultipleCreateTestJSON-1489643269 tempest-MultipleCreateTestJSON-1489643269-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 604.136570] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-a7ada3e9-7975-4e3d-9e6d-b8fa69e5f7db tempest-MultipleCreateTestJSON-1489643269 tempest-MultipleCreateTestJSON-1489643269-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 604.136570] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-a7ada3e9-7975-4e3d-9e6d-b8fa69e5f7db tempest-MultipleCreateTestJSON-1489643269 tempest-MultipleCreateTestJSON-1489643269-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 604.179900] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-a7ada3e9-7975-4e3d-9e6d-b8fa69e5f7db tempest-MultipleCreateTestJSON-1489643269 tempest-MultipleCreateTestJSON-1489643269-project-member] [instance: d74bc1d0-42f7-43aa-bdeb-a9153e3f86c6] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 604.181026] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-a7ada3e9-7975-4e3d-9e6d-b8fa69e5f7db tempest-MultipleCreateTestJSON-1489643269 tempest-MultipleCreateTestJSON-1489643269-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 604.181256] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-a7ada3e9-7975-4e3d-9e6d-b8fa69e5f7db tempest-MultipleCreateTestJSON-1489643269 tempest-MultipleCreateTestJSON-1489643269-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 604.181439] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-a7ada3e9-7975-4e3d-9e6d-b8fa69e5f7db tempest-MultipleCreateTestJSON-1489643269 tempest-MultipleCreateTestJSON-1489643269-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 604.184834] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-a7ada3e9-7975-4e3d-9e6d-b8fa69e5f7db tempest-MultipleCreateTestJSON-1489643269 tempest-MultipleCreateTestJSON-1489643269-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 604.184834] nova-conductor[52625]: Traceback (most recent call last): [ 604.184834] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 604.184834] nova-conductor[52625]: return func(*args, **kwargs) [ 604.184834] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 604.184834] nova-conductor[52625]: selections = self._select_destinations( [ 604.184834] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 604.184834] nova-conductor[52625]: selections = self._schedule( [ 604.184834] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 604.184834] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 604.184834] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 604.184834] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 604.184834] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 604.184834] nova-conductor[52625]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 604.185386] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-a7ada3e9-7975-4e3d-9e6d-b8fa69e5f7db tempest-MultipleCreateTestJSON-1489643269 tempest-MultipleCreateTestJSON-1489643269-project-member] [instance: d74bc1d0-42f7-43aa-bdeb-a9153e3f86c6] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
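Note: the block_device_mapping DEBUG entries above describe the root disk the conductor prepares for each of these requests: an image-backed disk (source_type='image', image_id ab7fcb5a-745a-4c08-9c04-49b187178f83) stored locally on the hypervisor (destination_type='local'), used as the boot device (boot_index=0) and deleted with the instance (delete_on_termination=True); fields printed with no value are simply unset on the versioned object. As a hedged illustration only, roughly the equivalent block_device_mapping_v2 entry a client request would carry looks like the snippet below; field names follow the compute API, and the UUID is the image id from the log.

    # Illustrative only: approximate block_device_mapping_v2 entry matching
    # the BlockDeviceMapping objects logged above.
    root_disk_bdm = {
        "boot_index": 0,
        "uuid": "ab7fcb5a-745a-4c08-9c04-49b187178f83",  # Glance image from the log
        "source_type": "image",         # boot from an image
        "destination_type": "local",    # ephemeral disk on the hypervisor
        "delete_on_termination": True,  # removed when the server is deleted
    }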
[ 604.760136] nova-conductor[52625]: ERROR nova.scheduler.utils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance dedec08e-95d1-4467-96a4-cdec5f170e01 was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 604.760914] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Rescheduling: True {{(pid=52625) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 604.761248] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance dedec08e-95d1-4467-96a4-cdec5f170e01.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. 
Exhausted all hosts available for retrying build failures for instance dedec08e-95d1-4467-96a4-cdec5f170e01. [ 604.761396] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance dedec08e-95d1-4467-96a4-cdec5f170e01. [ 609.080268] nova-conductor[52625]: ERROR nova.conductor.manager [None req-4f5737cd-431f-433f-b002-cdc24c219fc8 tempest-ListServerFiltersTestJSON-237626882 tempest-ListServerFiltersTestJSON-237626882-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 609.080268] nova-conductor[52625]: Traceback (most recent call last): [ 609.080268] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 609.080268] nova-conductor[52625]: return func(*args, **kwargs) [ 609.080268] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 609.080268] nova-conductor[52625]: selections = self._select_destinations( [ 609.080268] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 609.080268] nova-conductor[52625]: selections = self._schedule( [ 609.080268] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 609.080268] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 609.080268] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 609.080268] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 609.080268] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 609.080268] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 609.080268] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 609.080268] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 609.080268] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 609.080268] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 609.080268] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 609.080268] nova-conductor[52625]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 609.080268] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 609.080268] nova-conductor[52625]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 609.080268] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 609.080268] nova-conductor[52625]: ERROR nova.conductor.manager result = self.transport._send( [ 609.080268] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 609.080268] nova-conductor[52625]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 609.080268] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 609.080268] nova-conductor[52625]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 609.080268] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 609.080268] nova-conductor[52625]: ERROR nova.conductor.manager raise result [ 609.080268] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 609.080268] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 609.080268] nova-conductor[52625]: ERROR nova.conductor.manager [ 609.080268] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 609.080268] nova-conductor[52625]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 609.080268] nova-conductor[52625]: ERROR nova.conductor.manager [ 609.080268] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 609.080268] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._select_destinations( [ 609.080268] nova-conductor[52625]: ERROR nova.conductor.manager [ 609.080268] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 609.080268] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._schedule( [ 609.080268] nova-conductor[52625]: ERROR nova.conductor.manager [ 609.080268] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 609.080268] nova-conductor[52625]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 609.080268] nova-conductor[52625]: ERROR nova.conductor.manager [ 609.080268] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 609.080268] nova-conductor[52625]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 609.080268] nova-conductor[52625]: ERROR nova.conductor.manager [ 609.080268] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 609.080268] nova-conductor[52625]: ERROR nova.conductor.manager [ 609.080268] nova-conductor[52625]: ERROR nova.conductor.manager [ 609.097308] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-4f5737cd-431f-433f-b002-cdc24c219fc8 tempest-ListServerFiltersTestJSON-237626882 tempest-ListServerFiltersTestJSON-237626882-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 609.097308] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-4f5737cd-431f-433f-b002-cdc24c219fc8 tempest-ListServerFiltersTestJSON-237626882 tempest-ListServerFiltersTestJSON-237626882-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 609.097308] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-4f5737cd-431f-433f-b002-cdc24c219fc8 tempest-ListServerFiltersTestJSON-237626882 tempest-ListServerFiltersTestJSON-237626882-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 609.186557] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-4f5737cd-431f-433f-b002-cdc24c219fc8 tempest-ListServerFiltersTestJSON-237626882 tempest-ListServerFiltersTestJSON-237626882-project-member] [instance: 58f148af-635e-4301-a5b3-7ff28343bff2] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 609.187495] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-4f5737cd-431f-433f-b002-cdc24c219fc8 tempest-ListServerFiltersTestJSON-237626882 tempest-ListServerFiltersTestJSON-237626882-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 609.187623] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-4f5737cd-431f-433f-b002-cdc24c219fc8 tempest-ListServerFiltersTestJSON-237626882 tempest-ListServerFiltersTestJSON-237626882-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 609.187805] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-4f5737cd-431f-433f-b002-cdc24c219fc8 tempest-ListServerFiltersTestJSON-237626882 tempest-ListServerFiltersTestJSON-237626882-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 609.191222] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-4f5737cd-431f-433f-b002-cdc24c219fc8 tempest-ListServerFiltersTestJSON-237626882 tempest-ListServerFiltersTestJSON-237626882-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 609.191222] nova-conductor[52625]: Traceback (most recent call last): [ 609.191222] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 609.191222] nova-conductor[52625]: return func(*args, **kwargs) [ 609.191222] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 609.191222] nova-conductor[52625]: selections = self._select_destinations( [ 609.191222] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 609.191222] nova-conductor[52625]: selections = self._schedule( [ 609.191222] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 609.191222] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 609.191222] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 609.191222] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 609.191222] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 609.191222] nova-conductor[52625]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 609.191788] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-4f5737cd-431f-433f-b002-cdc24c219fc8 tempest-ListServerFiltersTestJSON-237626882 tempest-ListServerFiltersTestJSON-237626882-project-member] [instance: 58f148af-635e-4301-a5b3-7ff28343bff2] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 613.215260] nova-conductor[52625]: ERROR nova.conductor.manager [None req-3ef54974-0673-46ef-a794-e729b950f340 tempest-ListServerFiltersTestJSON-237626882 tempest-ListServerFiltersTestJSON-237626882-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 613.215260] nova-conductor[52625]: Traceback (most recent call last): [ 613.215260] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 613.215260] nova-conductor[52625]: return func(*args, **kwargs) [ 613.215260] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 613.215260] nova-conductor[52625]: selections = self._select_destinations( [ 613.215260] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 613.215260] nova-conductor[52625]: selections = self._schedule( [ 613.215260] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 613.215260] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 613.215260] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 613.215260] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 613.215260] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 613.215260] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 613.215260] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 613.215260] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 613.215260] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 613.215260] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 613.215260] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 613.215260] nova-conductor[52625]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 613.215260] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 613.215260] nova-conductor[52625]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 613.215260] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 613.215260] nova-conductor[52625]: ERROR nova.conductor.manager result = self.transport._send( [ 613.215260] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 613.215260] nova-conductor[52625]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 613.215260] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 613.215260] nova-conductor[52625]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 613.215260] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 613.215260] nova-conductor[52625]: ERROR nova.conductor.manager raise result [ 613.215260] nova-conductor[52625]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 613.215260] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 613.215260] nova-conductor[52625]: ERROR nova.conductor.manager [ 613.215260] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 613.215260] nova-conductor[52625]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 613.215260] nova-conductor[52625]: ERROR nova.conductor.manager [ 613.215260] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 613.215260] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._select_destinations( [ 613.215260] nova-conductor[52625]: ERROR nova.conductor.manager [ 613.215260] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 613.215260] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._schedule( [ 613.215260] nova-conductor[52625]: ERROR nova.conductor.manager [ 613.215260] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 613.215260] nova-conductor[52625]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 613.215260] nova-conductor[52625]: ERROR nova.conductor.manager [ 613.215260] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 613.215260] nova-conductor[52625]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 613.215260] nova-conductor[52625]: ERROR nova.conductor.manager [ 613.215260] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 613.215260] nova-conductor[52625]: ERROR nova.conductor.manager [ 613.215260] nova-conductor[52625]: ERROR nova.conductor.manager [ 613.226074] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-3ef54974-0673-46ef-a794-e729b950f340 tempest-ListServerFiltersTestJSON-237626882 tempest-ListServerFiltersTestJSON-237626882-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 613.228816] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-3ef54974-0673-46ef-a794-e729b950f340 tempest-ListServerFiltersTestJSON-237626882 tempest-ListServerFiltersTestJSON-237626882-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 613.228816] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-3ef54974-0673-46ef-a794-e729b950f340 tempest-ListServerFiltersTestJSON-237626882 tempest-ListServerFiltersTestJSON-237626882-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 613.277858] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-3ef54974-0673-46ef-a794-e729b950f340 tempest-ListServerFiltersTestJSON-237626882 tempest-ListServerFiltersTestJSON-237626882-project-member] [instance: 3ba53776-d6bc-4af4-b9cf-4106deb3e32f] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 613.278754] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-3ef54974-0673-46ef-a794-e729b950f340 tempest-ListServerFiltersTestJSON-237626882 tempest-ListServerFiltersTestJSON-237626882-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 613.278990] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-3ef54974-0673-46ef-a794-e729b950f340 tempest-ListServerFiltersTestJSON-237626882 tempest-ListServerFiltersTestJSON-237626882-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 613.279198] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-3ef54974-0673-46ef-a794-e729b950f340 tempest-ListServerFiltersTestJSON-237626882 tempest-ListServerFiltersTestJSON-237626882-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 613.282497] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-3ef54974-0673-46ef-a794-e729b950f340 tempest-ListServerFiltersTestJSON-237626882 tempest-ListServerFiltersTestJSON-237626882-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 613.282497] nova-conductor[52625]: Traceback (most recent call last): [ 613.282497] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 613.282497] nova-conductor[52625]: return func(*args, **kwargs) [ 613.282497] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 613.282497] nova-conductor[52625]: selections = self._select_destinations( [ 613.282497] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 613.282497] nova-conductor[52625]: selections = self._schedule( [ 613.282497] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 613.282497] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 613.282497] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 613.282497] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 613.282497] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 613.282497] nova-conductor[52625]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 613.283045] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-3ef54974-0673-46ef-a794-e729b950f340 tempest-ListServerFiltersTestJSON-237626882 tempest-ListServerFiltersTestJSON-237626882-project-member] [instance: 3ba53776-d6bc-4af4-b9cf-4106deb3e32f] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 617.006978] nova-conductor[52626]: ERROR nova.conductor.manager [None req-0d5d5b1d-2bbc-454f-a2a7-d67792bf381f tempest-VolumesAdminNegativeTest-480231075 tempest-VolumesAdminNegativeTest-480231075-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 617.006978] nova-conductor[52626]: Traceback (most recent call last): [ 617.006978] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 617.006978] nova-conductor[52626]: return func(*args, **kwargs) [ 617.006978] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 617.006978] nova-conductor[52626]: selections = self._select_destinations( [ 617.006978] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 617.006978] nova-conductor[52626]: selections = self._schedule( [ 617.006978] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 617.006978] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 617.006978] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 617.006978] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 617.006978] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 617.006978] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 617.006978] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 617.006978] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 617.006978] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 617.006978] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 617.006978] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 617.006978] nova-conductor[52626]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 617.006978] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 617.006978] nova-conductor[52626]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 617.006978] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 617.006978] nova-conductor[52626]: ERROR nova.conductor.manager result = self.transport._send( [ 617.006978] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 617.006978] nova-conductor[52626]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 617.006978] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 617.006978] nova-conductor[52626]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 617.006978] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 617.006978] nova-conductor[52626]: ERROR nova.conductor.manager raise result [ 617.006978] nova-conductor[52626]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 617.006978] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 617.006978] nova-conductor[52626]: ERROR nova.conductor.manager [ 617.006978] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 617.006978] nova-conductor[52626]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 617.006978] nova-conductor[52626]: ERROR nova.conductor.manager [ 617.006978] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 617.006978] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._select_destinations( [ 617.006978] nova-conductor[52626]: ERROR nova.conductor.manager [ 617.006978] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 617.006978] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._schedule( [ 617.006978] nova-conductor[52626]: ERROR nova.conductor.manager [ 617.006978] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 617.006978] nova-conductor[52626]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 617.006978] nova-conductor[52626]: ERROR nova.conductor.manager [ 617.006978] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 617.006978] nova-conductor[52626]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 617.006978] nova-conductor[52626]: ERROR nova.conductor.manager [ 617.006978] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 617.006978] nova-conductor[52626]: ERROR nova.conductor.manager [ 617.006978] nova-conductor[52626]: ERROR nova.conductor.manager [ 617.017131] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-0d5d5b1d-2bbc-454f-a2a7-d67792bf381f tempest-VolumesAdminNegativeTest-480231075 tempest-VolumesAdminNegativeTest-480231075-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 617.017282] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-0d5d5b1d-2bbc-454f-a2a7-d67792bf381f tempest-VolumesAdminNegativeTest-480231075 tempest-VolumesAdminNegativeTest-480231075-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 617.017443] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-0d5d5b1d-2bbc-454f-a2a7-d67792bf381f tempest-VolumesAdminNegativeTest-480231075 tempest-VolumesAdminNegativeTest-480231075-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 617.069926] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-0d5d5b1d-2bbc-454f-a2a7-d67792bf381f tempest-VolumesAdminNegativeTest-480231075 tempest-VolumesAdminNegativeTest-480231075-project-member] [instance: 6ece65dd-1b03-4916-9e1d-a5d135e575e8] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 617.071834] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-0d5d5b1d-2bbc-454f-a2a7-d67792bf381f tempest-VolumesAdminNegativeTest-480231075 tempest-VolumesAdminNegativeTest-480231075-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 617.071834] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-0d5d5b1d-2bbc-454f-a2a7-d67792bf381f tempest-VolumesAdminNegativeTest-480231075 tempest-VolumesAdminNegativeTest-480231075-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 617.071834] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-0d5d5b1d-2bbc-454f-a2a7-d67792bf381f tempest-VolumesAdminNegativeTest-480231075 tempest-VolumesAdminNegativeTest-480231075-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 617.075088] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-0d5d5b1d-2bbc-454f-a2a7-d67792bf381f tempest-VolumesAdminNegativeTest-480231075 tempest-VolumesAdminNegativeTest-480231075-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 617.075088] nova-conductor[52626]: Traceback (most recent call last): [ 617.075088] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 617.075088] nova-conductor[52626]: return func(*args, **kwargs) [ 617.075088] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 617.075088] nova-conductor[52626]: selections = self._select_destinations( [ 617.075088] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 617.075088] nova-conductor[52626]: selections = self._schedule( [ 617.075088] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 617.075088] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 617.075088] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 617.075088] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 617.075088] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 617.075088] nova-conductor[52626]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 617.075737] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-0d5d5b1d-2bbc-454f-a2a7-d67792bf381f tempest-VolumesAdminNegativeTest-480231075 tempest-VolumesAdminNegativeTest-480231075-project-member] [instance: 6ece65dd-1b03-4916-9e1d-a5d135e575e8] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 617.444716] nova-conductor[52625]: ERROR nova.conductor.manager [None req-052e2345-195e-4a18-8239-9b702bbf2d1b tempest-ListServersNegativeTestJSON-1678133107 tempest-ListServersNegativeTestJSON-1678133107-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 617.444716] nova-conductor[52625]: Traceback (most recent call last): [ 617.444716] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 617.444716] nova-conductor[52625]: return func(*args, **kwargs) [ 617.444716] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 617.444716] nova-conductor[52625]: selections = self._select_destinations( [ 617.444716] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 617.444716] nova-conductor[52625]: selections = self._schedule( [ 617.444716] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 617.444716] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 617.444716] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 617.444716] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 617.444716] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 617.444716] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 617.444716] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 617.444716] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 617.444716] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 617.444716] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 617.444716] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 617.444716] nova-conductor[52625]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 617.444716] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 617.444716] nova-conductor[52625]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 617.444716] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 617.444716] nova-conductor[52625]: ERROR nova.conductor.manager result = self.transport._send( [ 617.444716] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 617.444716] nova-conductor[52625]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 617.444716] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 617.444716] nova-conductor[52625]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 617.444716] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 617.444716] nova-conductor[52625]: ERROR nova.conductor.manager raise result [ 617.444716] nova-conductor[52625]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 617.444716] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 617.444716] nova-conductor[52625]: ERROR nova.conductor.manager [ 617.444716] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 617.444716] nova-conductor[52625]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 617.444716] nova-conductor[52625]: ERROR nova.conductor.manager [ 617.444716] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 617.444716] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._select_destinations( [ 617.444716] nova-conductor[52625]: ERROR nova.conductor.manager [ 617.444716] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 617.444716] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._schedule( [ 617.444716] nova-conductor[52625]: ERROR nova.conductor.manager [ 617.444716] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 617.444716] nova-conductor[52625]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 617.444716] nova-conductor[52625]: ERROR nova.conductor.manager [ 617.444716] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 617.444716] nova-conductor[52625]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 617.444716] nova-conductor[52625]: ERROR nova.conductor.manager [ 617.444716] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 617.444716] nova-conductor[52625]: ERROR nova.conductor.manager [ 617.444716] nova-conductor[52625]: ERROR nova.conductor.manager [ 617.451760] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-052e2345-195e-4a18-8239-9b702bbf2d1b tempest-ListServersNegativeTestJSON-1678133107 tempest-ListServersNegativeTestJSON-1678133107-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 617.452013] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-052e2345-195e-4a18-8239-9b702bbf2d1b tempest-ListServersNegativeTestJSON-1678133107 tempest-ListServersNegativeTestJSON-1678133107-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 617.452206] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-052e2345-195e-4a18-8239-9b702bbf2d1b tempest-ListServersNegativeTestJSON-1678133107 tempest-ListServersNegativeTestJSON-1678133107-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 617.501130] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-052e2345-195e-4a18-8239-9b702bbf2d1b tempest-ListServersNegativeTestJSON-1678133107 tempest-ListServersNegativeTestJSON-1678133107-project-member] [instance: 7f746826-2b36-4d9a-b98a-a84ce28b53d9] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 617.502051] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-052e2345-195e-4a18-8239-9b702bbf2d1b tempest-ListServersNegativeTestJSON-1678133107 tempest-ListServersNegativeTestJSON-1678133107-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 617.502201] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-052e2345-195e-4a18-8239-9b702bbf2d1b tempest-ListServersNegativeTestJSON-1678133107 tempest-ListServersNegativeTestJSON-1678133107-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 617.502444] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-052e2345-195e-4a18-8239-9b702bbf2d1b tempest-ListServersNegativeTestJSON-1678133107 
tempest-ListServersNegativeTestJSON-1678133107-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 617.505728] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-052e2345-195e-4a18-8239-9b702bbf2d1b tempest-ListServersNegativeTestJSON-1678133107 tempest-ListServersNegativeTestJSON-1678133107-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 617.505728] nova-conductor[52625]: Traceback (most recent call last): [ 617.505728] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 617.505728] nova-conductor[52625]: return func(*args, **kwargs) [ 617.505728] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 617.505728] nova-conductor[52625]: selections = self._select_destinations( [ 617.505728] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 617.505728] nova-conductor[52625]: selections = self._schedule( [ 617.505728] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 617.505728] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 617.505728] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 617.505728] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 617.505728] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 617.505728] nova-conductor[52625]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 617.506990] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-052e2345-195e-4a18-8239-9b702bbf2d1b tempest-ListServersNegativeTestJSON-1678133107 tempest-ListServersNegativeTestJSON-1678133107-project-member] [instance: 7f746826-2b36-4d9a-b98a-a84ce28b53d9] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 617.558297] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-052e2345-195e-4a18-8239-9b702bbf2d1b tempest-ListServersNegativeTestJSON-1678133107 tempest-ListServersNegativeTestJSON-1678133107-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 617.558569] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-052e2345-195e-4a18-8239-9b702bbf2d1b tempest-ListServersNegativeTestJSON-1678133107 tempest-ListServersNegativeTestJSON-1678133107-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 617.558856] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-052e2345-195e-4a18-8239-9b702bbf2d1b tempest-ListServersNegativeTestJSON-1678133107 tempest-ListServersNegativeTestJSON-1678133107-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 617.626049] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-052e2345-195e-4a18-8239-9b702bbf2d1b tempest-ListServersNegativeTestJSON-1678133107 tempest-ListServersNegativeTestJSON-1678133107-project-member] [instance: 1e1f9e54-519b-4817-8b45-7185c19a930a] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 617.626847] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-052e2345-195e-4a18-8239-9b702bbf2d1b tempest-ListServersNegativeTestJSON-1678133107 tempest-ListServersNegativeTestJSON-1678133107-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 617.627084] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-052e2345-195e-4a18-8239-9b702bbf2d1b tempest-ListServersNegativeTestJSON-1678133107 tempest-ListServersNegativeTestJSON-1678133107-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 617.627259] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-052e2345-195e-4a18-8239-9b702bbf2d1b tempest-ListServersNegativeTestJSON-1678133107 tempest-ListServersNegativeTestJSON-1678133107-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 617.631874] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-052e2345-195e-4a18-8239-9b702bbf2d1b tempest-ListServersNegativeTestJSON-1678133107 tempest-ListServersNegativeTestJSON-1678133107-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 617.631874] nova-conductor[52625]: Traceback (most recent call last): [ 617.631874] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 617.631874] nova-conductor[52625]: return func(*args, **kwargs) [ 617.631874] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 617.631874] nova-conductor[52625]: selections = self._select_destinations( [ 617.631874] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 617.631874] nova-conductor[52625]: selections = self._schedule( [ 617.631874] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 617.631874] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 617.631874] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 617.631874] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 617.631874] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 617.631874] nova-conductor[52625]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 617.632476] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-052e2345-195e-4a18-8239-9b702bbf2d1b tempest-ListServersNegativeTestJSON-1678133107 tempest-ListServersNegativeTestJSON-1678133107-project-member] [instance: 1e1f9e54-519b-4817-8b45-7185c19a930a] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 617.664568] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-052e2345-195e-4a18-8239-9b702bbf2d1b tempest-ListServersNegativeTestJSON-1678133107 tempest-ListServersNegativeTestJSON-1678133107-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 617.664969] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-052e2345-195e-4a18-8239-9b702bbf2d1b tempest-ListServersNegativeTestJSON-1678133107 tempest-ListServersNegativeTestJSON-1678133107-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 617.665289] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-052e2345-195e-4a18-8239-9b702bbf2d1b tempest-ListServersNegativeTestJSON-1678133107 tempest-ListServersNegativeTestJSON-1678133107-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 617.721869] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-052e2345-195e-4a18-8239-9b702bbf2d1b tempest-ListServersNegativeTestJSON-1678133107 tempest-ListServersNegativeTestJSON-1678133107-project-member] [instance: bd48f71f-f20c-437a-86ad-fc9052774889] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 617.722609] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-052e2345-195e-4a18-8239-9b702bbf2d1b tempest-ListServersNegativeTestJSON-1678133107 tempest-ListServersNegativeTestJSON-1678133107-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 617.722833] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-052e2345-195e-4a18-8239-9b702bbf2d1b tempest-ListServersNegativeTestJSON-1678133107 tempest-ListServersNegativeTestJSON-1678133107-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 617.724098] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-052e2345-195e-4a18-8239-9b702bbf2d1b tempest-ListServersNegativeTestJSON-1678133107 tempest-ListServersNegativeTestJSON-1678133107-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 617.726535] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-052e2345-195e-4a18-8239-9b702bbf2d1b tempest-ListServersNegativeTestJSON-1678133107 tempest-ListServersNegativeTestJSON-1678133107-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 617.726535] nova-conductor[52625]: Traceback (most recent call last): [ 617.726535] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 617.726535] nova-conductor[52625]: return func(*args, **kwargs) [ 617.726535] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 617.726535] nova-conductor[52625]: selections = self._select_destinations( [ 617.726535] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 617.726535] nova-conductor[52625]: selections = self._schedule( [ 617.726535] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 617.726535] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 617.726535] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 617.726535] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 617.726535] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 617.726535] nova-conductor[52625]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 617.727660] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-052e2345-195e-4a18-8239-9b702bbf2d1b tempest-ListServersNegativeTestJSON-1678133107 tempest-ListServersNegativeTestJSON-1678133107-project-member] [instance: bd48f71f-f20c-437a-86ad-fc9052774889] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 617.834866] nova-conductor[52626]: ERROR nova.conductor.manager [None req-0886599a-d786-48c1-abd4-2aa26f7ab034 tempest-ListServerFiltersTestJSON-237626882 tempest-ListServerFiltersTestJSON-237626882-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 617.834866] nova-conductor[52626]: Traceback (most recent call last): [ 617.834866] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 617.834866] nova-conductor[52626]: return func(*args, **kwargs) [ 617.834866] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 617.834866] nova-conductor[52626]: selections = self._select_destinations( [ 617.834866] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 617.834866] nova-conductor[52626]: selections = self._schedule( [ 617.834866] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 617.834866] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 617.834866] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 617.834866] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 617.834866] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 617.834866] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 617.834866] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 617.834866] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 617.834866] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 617.834866] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 617.834866] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 617.834866] nova-conductor[52626]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 617.834866] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 617.834866] nova-conductor[52626]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 617.834866] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 617.834866] nova-conductor[52626]: ERROR nova.conductor.manager result = self.transport._send( [ 617.834866] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 617.834866] nova-conductor[52626]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 617.834866] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 617.834866] nova-conductor[52626]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 617.834866] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 617.834866] nova-conductor[52626]: ERROR nova.conductor.manager raise result [ 617.834866] nova-conductor[52626]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 617.834866] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 617.834866] nova-conductor[52626]: ERROR nova.conductor.manager [ 617.834866] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 617.834866] nova-conductor[52626]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 617.834866] nova-conductor[52626]: ERROR nova.conductor.manager [ 617.834866] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 617.834866] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._select_destinations( [ 617.834866] nova-conductor[52626]: ERROR nova.conductor.manager [ 617.834866] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 617.834866] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._schedule( [ 617.834866] nova-conductor[52626]: ERROR nova.conductor.manager [ 617.834866] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 617.834866] nova-conductor[52626]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 617.834866] nova-conductor[52626]: ERROR nova.conductor.manager [ 617.834866] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 617.834866] nova-conductor[52626]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 617.834866] nova-conductor[52626]: ERROR nova.conductor.manager [ 617.834866] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 617.834866] nova-conductor[52626]: ERROR nova.conductor.manager [ 617.834866] nova-conductor[52626]: ERROR nova.conductor.manager [ 617.840915] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-0886599a-d786-48c1-abd4-2aa26f7ab034 tempest-ListServerFiltersTestJSON-237626882 tempest-ListServerFiltersTestJSON-237626882-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 617.840915] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-0886599a-d786-48c1-abd4-2aa26f7ab034 tempest-ListServerFiltersTestJSON-237626882 tempest-ListServerFiltersTestJSON-237626882-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 617.840915] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-0886599a-d786-48c1-abd4-2aa26f7ab034 tempest-ListServerFiltersTestJSON-237626882 tempest-ListServerFiltersTestJSON-237626882-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 617.905921] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-0886599a-d786-48c1-abd4-2aa26f7ab034 tempest-ListServerFiltersTestJSON-237626882 tempest-ListServerFiltersTestJSON-237626882-project-member] [instance: b04494ef-2de7-40ea-9e08-3826e05fdd05] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 617.906920] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-0886599a-d786-48c1-abd4-2aa26f7ab034 tempest-ListServerFiltersTestJSON-237626882 tempest-ListServerFiltersTestJSON-237626882-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 617.906920] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-0886599a-d786-48c1-abd4-2aa26f7ab034 tempest-ListServerFiltersTestJSON-237626882 tempest-ListServerFiltersTestJSON-237626882-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 617.907197] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-0886599a-d786-48c1-abd4-2aa26f7ab034 tempest-ListServerFiltersTestJSON-237626882 tempest-ListServerFiltersTestJSON-237626882-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 617.911151] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-0886599a-d786-48c1-abd4-2aa26f7ab034 tempest-ListServerFiltersTestJSON-237626882 tempest-ListServerFiltersTestJSON-237626882-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 617.911151] nova-conductor[52626]: Traceback (most recent call last): [ 617.911151] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 617.911151] nova-conductor[52626]: return func(*args, **kwargs) [ 617.911151] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 617.911151] nova-conductor[52626]: selections = self._select_destinations( [ 617.911151] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 617.911151] nova-conductor[52626]: selections = self._schedule( [ 617.911151] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 617.911151] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 617.911151] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 617.911151] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 617.911151] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 617.911151] nova-conductor[52626]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 617.911896] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-0886599a-d786-48c1-abd4-2aa26f7ab034 tempest-ListServerFiltersTestJSON-237626882 tempest-ListServerFiltersTestJSON-237626882-project-member] [instance: b04494ef-2de7-40ea-9e08-3826e05fdd05] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 618.980023] nova-conductor[52625]: ERROR nova.conductor.manager [None req-8dd6c07e-ed46-4c66-a0ca-270df89396bc tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 618.980023] nova-conductor[52625]: Traceback (most recent call last): [ 618.980023] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 618.980023] nova-conductor[52625]: return func(*args, **kwargs) [ 618.980023] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 618.980023] nova-conductor[52625]: selections = self._select_destinations( [ 618.980023] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 618.980023] nova-conductor[52625]: selections = self._schedule( [ 618.980023] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 618.980023] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 618.980023] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 618.980023] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 618.980023] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 618.980023] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 618.980023] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 618.980023] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 618.980023] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 618.980023] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 618.980023] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 618.980023] nova-conductor[52625]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 618.980023] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 618.980023] nova-conductor[52625]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 618.980023] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 618.980023] nova-conductor[52625]: ERROR nova.conductor.manager result = self.transport._send( [ 618.980023] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 618.980023] nova-conductor[52625]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 618.980023] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 618.980023] nova-conductor[52625]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 618.980023] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 618.980023] nova-conductor[52625]: ERROR nova.conductor.manager raise result [ 618.980023] nova-conductor[52625]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 618.980023] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 618.980023] nova-conductor[52625]: ERROR nova.conductor.manager [ 618.980023] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 618.980023] nova-conductor[52625]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 618.980023] nova-conductor[52625]: ERROR nova.conductor.manager [ 618.980023] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 618.980023] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._select_destinations( [ 618.980023] nova-conductor[52625]: ERROR nova.conductor.manager [ 618.980023] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 618.980023] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._schedule( [ 618.980023] nova-conductor[52625]: ERROR nova.conductor.manager [ 618.980023] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 618.980023] nova-conductor[52625]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 618.980023] nova-conductor[52625]: ERROR nova.conductor.manager [ 618.980023] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 618.980023] nova-conductor[52625]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 618.980023] nova-conductor[52625]: ERROR nova.conductor.manager [ 618.980023] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 618.980023] nova-conductor[52625]: ERROR nova.conductor.manager [ 618.980023] nova-conductor[52625]: ERROR nova.conductor.manager [ 619.006413] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-8dd6c07e-ed46-4c66-a0ca-270df89396bc tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 619.006900] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-8dd6c07e-ed46-4c66-a0ca-270df89396bc tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 619.007306] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-8dd6c07e-ed46-4c66-a0ca-270df89396bc tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 619.019393] nova-conductor[52626]: ERROR nova.conductor.manager [None req-0c681468-d5ef-4854-9115-7dcd85808a43 tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 619.019393] nova-conductor[52626]: Traceback (most recent call last): [ 619.019393] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 619.019393] nova-conductor[52626]: return func(*args, **kwargs) [ 619.019393] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 619.019393] nova-conductor[52626]: selections = self._select_destinations( [ 619.019393] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 619.019393] nova-conductor[52626]: selections = self._schedule( [ 619.019393] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 619.019393] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 619.019393] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 619.019393] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 619.019393] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 619.019393] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 619.019393] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 619.019393] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 619.019393] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 619.019393] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 619.019393] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 619.019393] nova-conductor[52626]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 619.019393] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 619.019393] nova-conductor[52626]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 619.019393] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 619.019393] nova-conductor[52626]: ERROR nova.conductor.manager result = self.transport._send( [ 619.019393] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 619.019393] nova-conductor[52626]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 619.019393] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 619.019393] nova-conductor[52626]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 619.019393] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 619.019393] nova-conductor[52626]: ERROR nova.conductor.manager raise result [ 619.019393] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 619.019393] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 619.019393] nova-conductor[52626]: ERROR nova.conductor.manager [ 619.019393] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 619.019393] nova-conductor[52626]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 619.019393] nova-conductor[52626]: ERROR nova.conductor.manager [ 619.019393] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 619.019393] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._select_destinations( [ 619.019393] nova-conductor[52626]: ERROR nova.conductor.manager [ 619.019393] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 619.019393] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._schedule( [ 619.019393] nova-conductor[52626]: ERROR nova.conductor.manager [ 619.019393] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 619.019393] nova-conductor[52626]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 619.019393] nova-conductor[52626]: ERROR nova.conductor.manager [ 619.019393] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 619.019393] nova-conductor[52626]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 619.019393] nova-conductor[52626]: ERROR nova.conductor.manager [ 619.019393] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 619.019393] nova-conductor[52626]: ERROR nova.conductor.manager [ 619.019393] nova-conductor[52626]: ERROR nova.conductor.manager [ 619.031899] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-0c681468-d5ef-4854-9115-7dcd85808a43 tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 619.032354] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-0c681468-d5ef-4854-9115-7dcd85808a43 tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 619.032700] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-0c681468-d5ef-4854-9115-7dcd85808a43 tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 619.088909] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-8dd6c07e-ed46-4c66-a0ca-270df89396bc tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: a10fb114-7343-49d8-a49c-0ab8e58f6f93] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 619.094803] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-8dd6c07e-ed46-4c66-a0ca-270df89396bc tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 619.095226] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-8dd6c07e-ed46-4c66-a0ca-270df89396bc tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.003s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 619.095730] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-8dd6c07e-ed46-4c66-a0ca-270df89396bc tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 619.106533] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-8dd6c07e-ed46-4c66-a0ca-270df89396bc tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 619.106533] nova-conductor[52625]: Traceback (most recent call last): [ 619.106533] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 619.106533] nova-conductor[52625]: return func(*args, **kwargs) [ 619.106533] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 619.106533] nova-conductor[52625]: selections = self._select_destinations( [ 619.106533] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 619.106533] nova-conductor[52625]: selections = self._schedule( [ 619.106533] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 619.106533] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 619.106533] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 619.106533] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 619.106533] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 619.106533] nova-conductor[52625]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 619.107096] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-8dd6c07e-ed46-4c66-a0ca-270df89396bc tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: a10fb114-7343-49d8-a49c-0ab8e58f6f93] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 619.109338] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-0c681468-d5ef-4854-9115-7dcd85808a43 tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] [instance: f3bedec4-adc0-4c9c-9716-601a696e1fa2] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 619.110397] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-0c681468-d5ef-4854-9115-7dcd85808a43 tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 619.110397] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-0c681468-d5ef-4854-9115-7dcd85808a43 tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 619.110517] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-0c681468-d5ef-4854-9115-7dcd85808a43 tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 619.113708] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-0c681468-d5ef-4854-9115-7dcd85808a43 tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. 
[ 619.113708] nova-conductor[52626]: Traceback (most recent call last): [ 619.113708] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 619.113708] nova-conductor[52626]: return func(*args, **kwargs) [ 619.113708] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 619.113708] nova-conductor[52626]: selections = self._select_destinations( [ 619.113708] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 619.113708] nova-conductor[52626]: selections = self._schedule( [ 619.113708] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 619.113708] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 619.113708] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 619.113708] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 619.113708] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 619.113708] nova-conductor[52626]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 619.114233] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-0c681468-d5ef-4854-9115-7dcd85808a43 tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] [instance: f3bedec4-adc0-4c9c-9716-601a696e1fa2] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 623.231265] nova-conductor[52625]: ERROR nova.conductor.manager [None req-166dcdf8-b493-45db-b2b2-a1e507650389 tempest-ServersTestBootFromVolume-835355594 tempest-ServersTestBootFromVolume-835355594-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 623.231265] nova-conductor[52625]: Traceback (most recent call last): [ 623.231265] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 623.231265] nova-conductor[52625]: return func(*args, **kwargs) [ 623.231265] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 623.231265] nova-conductor[52625]: selections = self._select_destinations( [ 623.231265] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 623.231265] nova-conductor[52625]: selections = self._schedule( [ 623.231265] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 623.231265] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 623.231265] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 623.231265] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 623.231265] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 623.231265] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 623.231265] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 623.231265] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 623.231265] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 623.231265] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 623.231265] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 623.231265] nova-conductor[52625]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 623.231265] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 623.231265] nova-conductor[52625]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 623.231265] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 623.231265] nova-conductor[52625]: ERROR nova.conductor.manager result = self.transport._send( [ 623.231265] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 623.231265] nova-conductor[52625]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 623.231265] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 623.231265] nova-conductor[52625]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 623.231265] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 623.231265] nova-conductor[52625]: ERROR nova.conductor.manager raise result [ 623.231265] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 623.231265] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 623.231265] nova-conductor[52625]: ERROR nova.conductor.manager [ 623.231265] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 623.231265] nova-conductor[52625]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 623.231265] nova-conductor[52625]: ERROR nova.conductor.manager [ 623.231265] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 623.231265] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._select_destinations( [ 623.231265] nova-conductor[52625]: ERROR nova.conductor.manager [ 623.231265] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 623.231265] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._schedule( [ 623.231265] nova-conductor[52625]: ERROR nova.conductor.manager [ 623.231265] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 623.231265] nova-conductor[52625]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 623.231265] nova-conductor[52625]: ERROR nova.conductor.manager [ 623.231265] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 623.231265] nova-conductor[52625]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 623.231265] nova-conductor[52625]: ERROR nova.conductor.manager [ 623.231265] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 623.231265] nova-conductor[52625]: ERROR nova.conductor.manager [ 623.231265] nova-conductor[52625]: ERROR nova.conductor.manager [ 623.240797] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-166dcdf8-b493-45db-b2b2-a1e507650389 tempest-ServersTestBootFromVolume-835355594 tempest-ServersTestBootFromVolume-835355594-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 623.241129] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-166dcdf8-b493-45db-b2b2-a1e507650389 tempest-ServersTestBootFromVolume-835355594 tempest-ServersTestBootFromVolume-835355594-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 623.241275] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-166dcdf8-b493-45db-b2b2-a1e507650389 tempest-ServersTestBootFromVolume-835355594 tempest-ServersTestBootFromVolume-835355594-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 623.292903] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-166dcdf8-b493-45db-b2b2-a1e507650389 tempest-ServersTestBootFromVolume-835355594 tempest-ServersTestBootFromVolume-835355594-project-member] [instance: f13ef27f-29eb-46fe-9a66-25e9ae41e697] block_device_mapping [BlockDeviceMapping(attachment_id=9cc57a5e-d7b9-4716-a4c7-0424cb151fa2,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='volume',device_name=None,device_type=None,disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id=None,instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='volume',tag=None,updated_at=,uuid=,volume_id='219d99c5-f486-423b-a2cf-2a7dfca05e57',volume_size=1,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 623.293603] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-166dcdf8-b493-45db-b2b2-a1e507650389 tempest-ServersTestBootFromVolume-835355594 tempest-ServersTestBootFromVolume-835355594-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 623.293832] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-166dcdf8-b493-45db-b2b2-a1e507650389 tempest-ServersTestBootFromVolume-835355594 tempest-ServersTestBootFromVolume-835355594-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 623.294027] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-166dcdf8-b493-45db-b2b2-a1e507650389 tempest-ServersTestBootFromVolume-835355594 
tempest-ServersTestBootFromVolume-835355594-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 623.297152] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-166dcdf8-b493-45db-b2b2-a1e507650389 tempest-ServersTestBootFromVolume-835355594 tempest-ServersTestBootFromVolume-835355594-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 623.297152] nova-conductor[52625]: Traceback (most recent call last): [ 623.297152] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 623.297152] nova-conductor[52625]: return func(*args, **kwargs) [ 623.297152] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 623.297152] nova-conductor[52625]: selections = self._select_destinations( [ 623.297152] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 623.297152] nova-conductor[52625]: selections = self._schedule( [ 623.297152] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 623.297152] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 623.297152] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 623.297152] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 623.297152] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 623.297152] nova-conductor[52625]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 623.297667] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-166dcdf8-b493-45db-b2b2-a1e507650389 tempest-ServersTestBootFromVolume-835355594 tempest-ServersTestBootFromVolume-835355594-project-member] [instance: f13ef27f-29eb-46fe-9a66-25e9ae41e697] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 625.298051] nova-conductor[52626]: ERROR nova.conductor.manager [None req-e026ed9f-e3a5-42fa-a3fd-5c7f51c19bb2 tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 625.298051] nova-conductor[52626]: Traceback (most recent call last): [ 625.298051] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 625.298051] nova-conductor[52626]: return func(*args, **kwargs) [ 625.298051] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 625.298051] nova-conductor[52626]: selections = self._select_destinations( [ 625.298051] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 625.298051] nova-conductor[52626]: selections = self._schedule( [ 625.298051] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 625.298051] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 625.298051] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 625.298051] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 625.298051] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 625.298051] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 625.298051] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 625.298051] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 625.298051] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 625.298051] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 625.298051] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 625.298051] nova-conductor[52626]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 625.298051] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 625.298051] nova-conductor[52626]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 625.298051] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 625.298051] nova-conductor[52626]: ERROR nova.conductor.manager result = self.transport._send( [ 625.298051] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 625.298051] nova-conductor[52626]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 625.298051] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 625.298051] nova-conductor[52626]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 625.298051] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 625.298051] nova-conductor[52626]: ERROR nova.conductor.manager raise result [ 625.298051] nova-conductor[52626]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 625.298051] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 625.298051] nova-conductor[52626]: ERROR nova.conductor.manager [ 625.298051] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 625.298051] nova-conductor[52626]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 625.298051] nova-conductor[52626]: ERROR nova.conductor.manager [ 625.298051] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 625.298051] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._select_destinations( [ 625.298051] nova-conductor[52626]: ERROR nova.conductor.manager [ 625.298051] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 625.298051] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._schedule( [ 625.298051] nova-conductor[52626]: ERROR nova.conductor.manager [ 625.298051] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 625.298051] nova-conductor[52626]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 625.298051] nova-conductor[52626]: ERROR nova.conductor.manager [ 625.298051] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 625.298051] nova-conductor[52626]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 625.298051] nova-conductor[52626]: ERROR nova.conductor.manager [ 625.298051] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 625.298051] nova-conductor[52626]: ERROR nova.conductor.manager [ 625.298051] nova-conductor[52626]: ERROR nova.conductor.manager [ 625.309020] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-e026ed9f-e3a5-42fa-a3fd-5c7f51c19bb2 tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 625.309020] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-e026ed9f-e3a5-42fa-a3fd-5c7f51c19bb2 tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 625.309020] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-e026ed9f-e3a5-42fa-a3fd-5c7f51c19bb2 tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 625.359099] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-e026ed9f-e3a5-42fa-a3fd-5c7f51c19bb2 tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 83a0e83a-56c0-4ba7-9000-e99bd4c9b68d] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 625.359099] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-e026ed9f-e3a5-42fa-a3fd-5c7f51c19bb2 tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 625.359099] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-e026ed9f-e3a5-42fa-a3fd-5c7f51c19bb2 tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 625.359099] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-e026ed9f-e3a5-42fa-a3fd-5c7f51c19bb2 tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) 
inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 625.361914] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-e026ed9f-e3a5-42fa-a3fd-5c7f51c19bb2 tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 625.361914] nova-conductor[52626]: Traceback (most recent call last): [ 625.361914] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 625.361914] nova-conductor[52626]: return func(*args, **kwargs) [ 625.361914] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 625.361914] nova-conductor[52626]: selections = self._select_destinations( [ 625.361914] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 625.361914] nova-conductor[52626]: selections = self._schedule( [ 625.361914] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 625.361914] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 625.361914] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 625.361914] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 625.361914] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 625.361914] nova-conductor[52626]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 625.363087] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-e026ed9f-e3a5-42fa-a3fd-5c7f51c19bb2 tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 83a0e83a-56c0-4ba7-9000-e99bd4c9b68d] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 626.090336] nova-conductor[52626]: ERROR nova.conductor.manager [None req-c3835489-dcd4-4e48-b922-a810353ed78f tempest-SecurityGroupsTestJSON-754098049 tempest-SecurityGroupsTestJSON-754098049-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 626.090336] nova-conductor[52626]: Traceback (most recent call last): [ 626.090336] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 626.090336] nova-conductor[52626]: return func(*args, **kwargs) [ 626.090336] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 626.090336] nova-conductor[52626]: selections = self._select_destinations( [ 626.090336] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 626.090336] nova-conductor[52626]: selections = self._schedule( [ 626.090336] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 626.090336] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 626.090336] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 626.090336] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 626.090336] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. 
There are not enough hosts available. [ 626.090336] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 626.090336] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 626.090336] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 626.090336] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 626.090336] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 626.090336] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 626.090336] nova-conductor[52626]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 626.090336] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 626.090336] nova-conductor[52626]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 626.090336] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 626.090336] nova-conductor[52626]: ERROR nova.conductor.manager result = self.transport._send( [ 626.090336] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 626.090336] nova-conductor[52626]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 626.090336] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 626.090336] nova-conductor[52626]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 626.090336] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 626.090336] nova-conductor[52626]: ERROR nova.conductor.manager raise result [ 626.090336] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 626.090336] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 626.090336] nova-conductor[52626]: ERROR nova.conductor.manager [ 626.090336] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 626.090336] nova-conductor[52626]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 626.090336] nova-conductor[52626]: ERROR nova.conductor.manager [ 626.090336] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 626.090336] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._select_destinations( [ 626.090336] nova-conductor[52626]: ERROR nova.conductor.manager [ 626.090336] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 626.090336] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._schedule( [ 626.090336] nova-conductor[52626]: ERROR nova.conductor.manager [ 626.090336] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 626.090336] nova-conductor[52626]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 626.090336] nova-conductor[52626]: ERROR nova.conductor.manager [ 626.090336] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 626.090336] nova-conductor[52626]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 626.090336] nova-conductor[52626]: ERROR nova.conductor.manager [ 626.090336] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 626.090336] nova-conductor[52626]: ERROR nova.conductor.manager [ 626.090336] nova-conductor[52626]: ERROR nova.conductor.manager [ 626.097958] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-c3835489-dcd4-4e48-b922-a810353ed78f tempest-SecurityGroupsTestJSON-754098049 tempest-SecurityGroupsTestJSON-754098049-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 626.098261] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-c3835489-dcd4-4e48-b922-a810353ed78f tempest-SecurityGroupsTestJSON-754098049 tempest-SecurityGroupsTestJSON-754098049-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 626.098481] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-c3835489-dcd4-4e48-b922-a810353ed78f tempest-SecurityGroupsTestJSON-754098049 tempest-SecurityGroupsTestJSON-754098049-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 626.145209] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-c3835489-dcd4-4e48-b922-a810353ed78f tempest-SecurityGroupsTestJSON-754098049 tempest-SecurityGroupsTestJSON-754098049-project-member] [instance: 2cb511a1-8e1c-453a-8699-eaadc297558a] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 626.145994] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-c3835489-dcd4-4e48-b922-a810353ed78f tempest-SecurityGroupsTestJSON-754098049 tempest-SecurityGroupsTestJSON-754098049-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 626.146224] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-c3835489-dcd4-4e48-b922-a810353ed78f tempest-SecurityGroupsTestJSON-754098049 tempest-SecurityGroupsTestJSON-754098049-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 626.146406] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-c3835489-dcd4-4e48-b922-a810353ed78f tempest-SecurityGroupsTestJSON-754098049 tempest-SecurityGroupsTestJSON-754098049-project-member] Lock "00000000-0000-0000-0000-000000000000" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 626.149497] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-c3835489-dcd4-4e48-b922-a810353ed78f tempest-SecurityGroupsTestJSON-754098049 tempest-SecurityGroupsTestJSON-754098049-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 626.149497] nova-conductor[52626]: Traceback (most recent call last): [ 626.149497] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 626.149497] nova-conductor[52626]: return func(*args, **kwargs) [ 626.149497] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 626.149497] nova-conductor[52626]: selections = self._select_destinations( [ 626.149497] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 626.149497] nova-conductor[52626]: selections = self._schedule( [ 626.149497] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 626.149497] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 626.149497] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 626.149497] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 626.149497] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 626.149497] nova-conductor[52626]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 626.150278] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-c3835489-dcd4-4e48-b922-a810353ed78f tempest-SecurityGroupsTestJSON-754098049 tempest-SecurityGroupsTestJSON-754098049-project-member] [instance: 2cb511a1-8e1c-453a-8699-eaadc297558a] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 626.458340] nova-conductor[52625]: ERROR nova.conductor.manager [None req-666bd3e5-ee7f-414f-9992-71b046cd3690 tempest-ListImageFiltersTestJSON-803728972 tempest-ListImageFiltersTestJSON-803728972-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 626.458340] nova-conductor[52625]: Traceback (most recent call last): [ 626.458340] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 626.458340] nova-conductor[52625]: return func(*args, **kwargs) [ 626.458340] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 626.458340] nova-conductor[52625]: selections = self._select_destinations( [ 626.458340] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 626.458340] nova-conductor[52625]: selections = self._schedule( [ 626.458340] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 626.458340] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 626.458340] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 626.458340] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 626.458340] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 626.458340] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 626.458340] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 626.458340] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 626.458340] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 626.458340] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 626.458340] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 626.458340] nova-conductor[52625]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 626.458340] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 626.458340] nova-conductor[52625]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 626.458340] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 626.458340] nova-conductor[52625]: ERROR nova.conductor.manager result = self.transport._send( [ 626.458340] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 626.458340] nova-conductor[52625]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 626.458340] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 626.458340] nova-conductor[52625]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 626.458340] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 626.458340] nova-conductor[52625]: ERROR nova.conductor.manager raise result [ 626.458340] nova-conductor[52625]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 626.458340] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 626.458340] nova-conductor[52625]: ERROR nova.conductor.manager [ 626.458340] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 626.458340] nova-conductor[52625]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 626.458340] nova-conductor[52625]: ERROR nova.conductor.manager [ 626.458340] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 626.458340] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._select_destinations( [ 626.458340] nova-conductor[52625]: ERROR nova.conductor.manager [ 626.458340] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 626.458340] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._schedule( [ 626.458340] nova-conductor[52625]: ERROR nova.conductor.manager [ 626.458340] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 626.458340] nova-conductor[52625]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 626.458340] nova-conductor[52625]: ERROR nova.conductor.manager [ 626.458340] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 626.458340] nova-conductor[52625]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 626.458340] nova-conductor[52625]: ERROR nova.conductor.manager [ 626.458340] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 626.458340] nova-conductor[52625]: ERROR nova.conductor.manager [ 626.458340] nova-conductor[52625]: ERROR nova.conductor.manager [ 626.468929] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-666bd3e5-ee7f-414f-9992-71b046cd3690 tempest-ListImageFiltersTestJSON-803728972 tempest-ListImageFiltersTestJSON-803728972-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 626.468929] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-666bd3e5-ee7f-414f-9992-71b046cd3690 tempest-ListImageFiltersTestJSON-803728972 tempest-ListImageFiltersTestJSON-803728972-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 626.469087] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-666bd3e5-ee7f-414f-9992-71b046cd3690 tempest-ListImageFiltersTestJSON-803728972 tempest-ListImageFiltersTestJSON-803728972-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 626.548587] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-666bd3e5-ee7f-414f-9992-71b046cd3690 tempest-ListImageFiltersTestJSON-803728972 tempest-ListImageFiltersTestJSON-803728972-project-member] [instance: bcde62fe-6dce-44cf-8e62-56aaebf0fdde] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 626.549392] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-666bd3e5-ee7f-414f-9992-71b046cd3690 tempest-ListImageFiltersTestJSON-803728972 tempest-ListImageFiltersTestJSON-803728972-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 626.549767] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-666bd3e5-ee7f-414f-9992-71b046cd3690 tempest-ListImageFiltersTestJSON-803728972 tempest-ListImageFiltersTestJSON-803728972-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 626.549899] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-666bd3e5-ee7f-414f-9992-71b046cd3690 tempest-ListImageFiltersTestJSON-803728972 tempest-ListImageFiltersTestJSON-803728972-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 626.553020] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-666bd3e5-ee7f-414f-9992-71b046cd3690 tempest-ListImageFiltersTestJSON-803728972 tempest-ListImageFiltersTestJSON-803728972-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 626.553020] nova-conductor[52625]: Traceback (most recent call last): [ 626.553020] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 626.553020] nova-conductor[52625]: return func(*args, **kwargs) [ 626.553020] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 626.553020] nova-conductor[52625]: selections = self._select_destinations( [ 626.553020] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 626.553020] nova-conductor[52625]: selections = self._schedule( [ 626.553020] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 626.553020] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 626.553020] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 626.553020] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 626.553020] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 626.553020] nova-conductor[52625]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 626.553609] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-666bd3e5-ee7f-414f-9992-71b046cd3690 tempest-ListImageFiltersTestJSON-803728972 tempest-ListImageFiltersTestJSON-803728972-project-member] [instance: bcde62fe-6dce-44cf-8e62-56aaebf0fdde] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 627.515063] nova-conductor[52626]: ERROR nova.conductor.manager [None req-aaba7d08-c9b1-4ae9-a146-2eb8b65c5718 tempest-ServersNegativeTestMultiTenantJSON-88046137 tempest-ServersNegativeTestMultiTenantJSON-88046137-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 627.515063] nova-conductor[52626]: Traceback (most recent call last): [ 627.515063] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 627.515063] nova-conductor[52626]: return func(*args, **kwargs) [ 627.515063] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 627.515063] nova-conductor[52626]: selections = self._select_destinations( [ 627.515063] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 627.515063] nova-conductor[52626]: selections = self._schedule( [ 627.515063] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 627.515063] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 627.515063] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 627.515063] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 627.515063] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 627.515063] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 627.515063] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 627.515063] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 627.515063] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 627.515063] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 627.515063] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 627.515063] nova-conductor[52626]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 627.515063] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 627.515063] nova-conductor[52626]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 627.515063] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 627.515063] nova-conductor[52626]: ERROR nova.conductor.manager result = self.transport._send( [ 627.515063] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 627.515063] nova-conductor[52626]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 627.515063] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 627.515063] nova-conductor[52626]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 627.515063] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 627.515063] nova-conductor[52626]: ERROR nova.conductor.manager raise result [ 627.515063] nova-conductor[52626]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 627.515063] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 627.515063] nova-conductor[52626]: ERROR nova.conductor.manager [ 627.515063] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 627.515063] nova-conductor[52626]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 627.515063] nova-conductor[52626]: ERROR nova.conductor.manager [ 627.515063] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 627.515063] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._select_destinations( [ 627.515063] nova-conductor[52626]: ERROR nova.conductor.manager [ 627.515063] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 627.515063] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._schedule( [ 627.515063] nova-conductor[52626]: ERROR nova.conductor.manager [ 627.515063] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 627.515063] nova-conductor[52626]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 627.515063] nova-conductor[52626]: ERROR nova.conductor.manager [ 627.515063] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 627.515063] nova-conductor[52626]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 627.515063] nova-conductor[52626]: ERROR nova.conductor.manager [ 627.515063] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 627.515063] nova-conductor[52626]: ERROR nova.conductor.manager [ 627.515063] nova-conductor[52626]: ERROR nova.conductor.manager [ 627.523949] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-aaba7d08-c9b1-4ae9-a146-2eb8b65c5718 tempest-ServersNegativeTestMultiTenantJSON-88046137 tempest-ServersNegativeTestMultiTenantJSON-88046137-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 627.524286] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-aaba7d08-c9b1-4ae9-a146-2eb8b65c5718 tempest-ServersNegativeTestMultiTenantJSON-88046137 tempest-ServersNegativeTestMultiTenantJSON-88046137-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 627.526845] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-aaba7d08-c9b1-4ae9-a146-2eb8b65c5718 tempest-ServersNegativeTestMultiTenantJSON-88046137 tempest-ServersNegativeTestMultiTenantJSON-88046137-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 627.548995] nova-conductor[52625]: ERROR nova.conductor.manager [None req-2673b856-6830-42da-9247-e6ef15daf57b tempest-MultipleCreateTestJSON-1489643269 tempest-MultipleCreateTestJSON-1489643269-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 627.548995] nova-conductor[52625]: Traceback (most recent call last): [ 627.548995] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 627.548995] nova-conductor[52625]: return func(*args, **kwargs) [ 627.548995] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 627.548995] nova-conductor[52625]: selections = self._select_destinations( [ 627.548995] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 627.548995] nova-conductor[52625]: selections = self._schedule( [ 627.548995] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 627.548995] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 627.548995] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 627.548995] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 627.548995] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 627.548995] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 627.548995] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 627.548995] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 627.548995] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 627.548995] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 627.548995] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 627.548995] nova-conductor[52625]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 627.548995] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 627.548995] nova-conductor[52625]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 627.548995] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 627.548995] nova-conductor[52625]: ERROR nova.conductor.manager result = self.transport._send( [ 627.548995] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 627.548995] nova-conductor[52625]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 627.548995] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 627.548995] nova-conductor[52625]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 627.548995] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 627.548995] nova-conductor[52625]: ERROR nova.conductor.manager raise result [ 627.548995] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 627.548995] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 627.548995] nova-conductor[52625]: ERROR nova.conductor.manager [ 627.548995] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 627.548995] nova-conductor[52625]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 627.548995] nova-conductor[52625]: ERROR nova.conductor.manager [ 627.548995] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 627.548995] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._select_destinations( [ 627.548995] nova-conductor[52625]: ERROR nova.conductor.manager [ 627.548995] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 627.548995] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._schedule( [ 627.548995] nova-conductor[52625]: ERROR nova.conductor.manager [ 627.548995] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 627.548995] nova-conductor[52625]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 627.548995] nova-conductor[52625]: ERROR nova.conductor.manager [ 627.548995] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 627.548995] nova-conductor[52625]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 627.548995] nova-conductor[52625]: ERROR nova.conductor.manager [ 627.548995] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 627.548995] nova-conductor[52625]: ERROR nova.conductor.manager [ 627.548995] nova-conductor[52625]: ERROR nova.conductor.manager [ 627.560158] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-2673b856-6830-42da-9247-e6ef15daf57b tempest-MultipleCreateTestJSON-1489643269 tempest-MultipleCreateTestJSON-1489643269-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 627.560407] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-2673b856-6830-42da-9247-e6ef15daf57b tempest-MultipleCreateTestJSON-1489643269 tempest-MultipleCreateTestJSON-1489643269-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 627.560589] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-2673b856-6830-42da-9247-e6ef15daf57b tempest-MultipleCreateTestJSON-1489643269 tempest-MultipleCreateTestJSON-1489643269-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 627.582920] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-aaba7d08-c9b1-4ae9-a146-2eb8b65c5718 tempest-ServersNegativeTestMultiTenantJSON-88046137 tempest-ServersNegativeTestMultiTenantJSON-88046137-project-member] [instance: 8f1eddd4-44b9-4dbc-8b51-edb1eb0a27e3] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 627.583624] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-aaba7d08-c9b1-4ae9-a146-2eb8b65c5718 tempest-ServersNegativeTestMultiTenantJSON-88046137 tempest-ServersNegativeTestMultiTenantJSON-88046137-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 627.583826] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-aaba7d08-c9b1-4ae9-a146-2eb8b65c5718 tempest-ServersNegativeTestMultiTenantJSON-88046137 tempest-ServersNegativeTestMultiTenantJSON-88046137-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 627.584010] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-aaba7d08-c9b1-4ae9-a146-2eb8b65c5718 tempest-ServersNegativeTestMultiTenantJSON-88046137 
tempest-ServersNegativeTestMultiTenantJSON-88046137-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 627.588700] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-aaba7d08-c9b1-4ae9-a146-2eb8b65c5718 tempest-ServersNegativeTestMultiTenantJSON-88046137 tempest-ServersNegativeTestMultiTenantJSON-88046137-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 627.588700] nova-conductor[52626]: Traceback (most recent call last): [ 627.588700] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 627.588700] nova-conductor[52626]: return func(*args, **kwargs) [ 627.588700] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 627.588700] nova-conductor[52626]: selections = self._select_destinations( [ 627.588700] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 627.588700] nova-conductor[52626]: selections = self._schedule( [ 627.588700] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 627.588700] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 627.588700] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 627.588700] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 627.588700] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 627.588700] nova-conductor[52626]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 627.589263] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-aaba7d08-c9b1-4ae9-a146-2eb8b65c5718 tempest-ServersNegativeTestMultiTenantJSON-88046137 tempest-ServersNegativeTestMultiTenantJSON-88046137-project-member] [instance: 8f1eddd4-44b9-4dbc-8b51-edb1eb0a27e3] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 627.602944] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-2673b856-6830-42da-9247-e6ef15daf57b tempest-MultipleCreateTestJSON-1489643269 tempest-MultipleCreateTestJSON-1489643269-project-member] [instance: 0c41974b-788e-4075-b634-74af5f749110] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 627.603699] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-2673b856-6830-42da-9247-e6ef15daf57b tempest-MultipleCreateTestJSON-1489643269 tempest-MultipleCreateTestJSON-1489643269-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 627.604032] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-2673b856-6830-42da-9247-e6ef15daf57b tempest-MultipleCreateTestJSON-1489643269 tempest-MultipleCreateTestJSON-1489643269-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 627.604255] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-2673b856-6830-42da-9247-e6ef15daf57b tempest-MultipleCreateTestJSON-1489643269 tempest-MultipleCreateTestJSON-1489643269-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 627.608759] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-2673b856-6830-42da-9247-e6ef15daf57b tempest-MultipleCreateTestJSON-1489643269 tempest-MultipleCreateTestJSON-1489643269-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. 
[ 627.608759] nova-conductor[52625]: Traceback (most recent call last): [ 627.608759] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 627.608759] nova-conductor[52625]: return func(*args, **kwargs) [ 627.608759] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 627.608759] nova-conductor[52625]: selections = self._select_destinations( [ 627.608759] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 627.608759] nova-conductor[52625]: selections = self._schedule( [ 627.608759] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 627.608759] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 627.608759] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 627.608759] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 627.608759] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 627.608759] nova-conductor[52625]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 627.609332] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-2673b856-6830-42da-9247-e6ef15daf57b tempest-MultipleCreateTestJSON-1489643269 tempest-MultipleCreateTestJSON-1489643269-project-member] [instance: 0c41974b-788e-4075-b634-74af5f749110] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 627.632163] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-2673b856-6830-42da-9247-e6ef15daf57b tempest-MultipleCreateTestJSON-1489643269 tempest-MultipleCreateTestJSON-1489643269-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 627.632407] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-2673b856-6830-42da-9247-e6ef15daf57b tempest-MultipleCreateTestJSON-1489643269 tempest-MultipleCreateTestJSON-1489643269-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 627.632608] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-2673b856-6830-42da-9247-e6ef15daf57b tempest-MultipleCreateTestJSON-1489643269 tempest-MultipleCreateTestJSON-1489643269-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 627.671204] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-2673b856-6830-42da-9247-e6ef15daf57b tempest-MultipleCreateTestJSON-1489643269 tempest-MultipleCreateTestJSON-1489643269-project-member] [instance: 0ec0f5ae-f803-4557-b6f0-24713eed7caf] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 627.672184] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-2673b856-6830-42da-9247-e6ef15daf57b tempest-MultipleCreateTestJSON-1489643269 tempest-MultipleCreateTestJSON-1489643269-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 627.672630] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-2673b856-6830-42da-9247-e6ef15daf57b tempest-MultipleCreateTestJSON-1489643269 tempest-MultipleCreateTestJSON-1489643269-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 627.672898] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-2673b856-6830-42da-9247-e6ef15daf57b tempest-MultipleCreateTestJSON-1489643269 tempest-MultipleCreateTestJSON-1489643269-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 627.677766] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-2673b856-6830-42da-9247-e6ef15daf57b tempest-MultipleCreateTestJSON-1489643269 tempest-MultipleCreateTestJSON-1489643269-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 627.677766] nova-conductor[52625]: Traceback (most recent call last): [ 627.677766] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 627.677766] nova-conductor[52625]: return func(*args, **kwargs) [ 627.677766] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 627.677766] nova-conductor[52625]: selections = self._select_destinations( [ 627.677766] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 627.677766] nova-conductor[52625]: selections = self._schedule( [ 627.677766] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 627.677766] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 627.677766] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 627.677766] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 627.677766] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 627.677766] nova-conductor[52625]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 627.678514] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-2673b856-6830-42da-9247-e6ef15daf57b tempest-MultipleCreateTestJSON-1489643269 tempest-MultipleCreateTestJSON-1489643269-project-member] [instance: 0ec0f5ae-f803-4557-b6f0-24713eed7caf] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 628.592640] nova-conductor[52626]: ERROR nova.conductor.manager [None req-0614cc10-cb93-4dc3-9289-64d1295b94dd tempest-ServerActionsTestJSON-169140657 tempest-ServerActionsTestJSON-169140657-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 628.592640] nova-conductor[52626]: Traceback (most recent call last): [ 628.592640] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 628.592640] nova-conductor[52626]: return func(*args, **kwargs) [ 628.592640] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 628.592640] nova-conductor[52626]: selections = self._select_destinations( [ 628.592640] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 628.592640] nova-conductor[52626]: selections = self._schedule( [ 628.592640] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 628.592640] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 628.592640] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 628.592640] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 628.592640] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 628.592640] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 628.592640] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 628.592640] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 628.592640] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 628.592640] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 628.592640] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 628.592640] nova-conductor[52626]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 628.592640] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 628.592640] nova-conductor[52626]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 628.592640] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 628.592640] nova-conductor[52626]: ERROR nova.conductor.manager result = self.transport._send( [ 628.592640] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 628.592640] nova-conductor[52626]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 628.592640] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 628.592640] nova-conductor[52626]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 628.592640] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 628.592640] nova-conductor[52626]: ERROR nova.conductor.manager raise result [ 628.592640] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 628.592640] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 628.592640] nova-conductor[52626]: ERROR nova.conductor.manager [ 628.592640] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 628.592640] nova-conductor[52626]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 628.592640] nova-conductor[52626]: ERROR nova.conductor.manager [ 628.592640] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 628.592640] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._select_destinations( [ 628.592640] nova-conductor[52626]: ERROR nova.conductor.manager [ 628.592640] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 628.592640] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._schedule( [ 628.592640] nova-conductor[52626]: ERROR nova.conductor.manager [ 628.592640] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 628.592640] nova-conductor[52626]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 628.592640] nova-conductor[52626]: ERROR nova.conductor.manager [ 628.592640] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 628.592640] nova-conductor[52626]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 628.592640] nova-conductor[52626]: ERROR nova.conductor.manager [ 628.592640] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 628.592640] nova-conductor[52626]: ERROR nova.conductor.manager [ 628.592640] nova-conductor[52626]: ERROR nova.conductor.manager [ 628.600193] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-0614cc10-cb93-4dc3-9289-64d1295b94dd tempest-ServerActionsTestJSON-169140657 tempest-ServerActionsTestJSON-169140657-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 628.600449] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-0614cc10-cb93-4dc3-9289-64d1295b94dd tempest-ServerActionsTestJSON-169140657 tempest-ServerActionsTestJSON-169140657-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 628.600603] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-0614cc10-cb93-4dc3-9289-64d1295b94dd tempest-ServerActionsTestJSON-169140657 tempest-ServerActionsTestJSON-169140657-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 628.649822] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-0614cc10-cb93-4dc3-9289-64d1295b94dd tempest-ServerActionsTestJSON-169140657 tempest-ServerActionsTestJSON-169140657-project-member] [instance: 81bdb4de-7032-481f-b2fe-a00763dece07] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 628.650844] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-0614cc10-cb93-4dc3-9289-64d1295b94dd tempest-ServerActionsTestJSON-169140657 tempest-ServerActionsTestJSON-169140657-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 628.650844] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-0614cc10-cb93-4dc3-9289-64d1295b94dd tempest-ServerActionsTestJSON-169140657 tempest-ServerActionsTestJSON-169140657-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 628.650968] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-0614cc10-cb93-4dc3-9289-64d1295b94dd tempest-ServerActionsTestJSON-169140657 tempest-ServerActionsTestJSON-169140657-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 628.654035] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-0614cc10-cb93-4dc3-9289-64d1295b94dd tempest-ServerActionsTestJSON-169140657 tempest-ServerActionsTestJSON-169140657-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 628.654035] nova-conductor[52626]: Traceback (most recent call last): [ 628.654035] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 628.654035] nova-conductor[52626]: return func(*args, **kwargs) [ 628.654035] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 628.654035] nova-conductor[52626]: selections = self._select_destinations( [ 628.654035] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 628.654035] nova-conductor[52626]: selections = self._schedule( [ 628.654035] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 628.654035] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 628.654035] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 628.654035] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 628.654035] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 628.654035] nova-conductor[52626]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 628.654554] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-0614cc10-cb93-4dc3-9289-64d1295b94dd tempest-ServerActionsTestJSON-169140657 tempest-ServerActionsTestJSON-169140657-project-member] [instance: 81bdb4de-7032-481f-b2fe-a00763dece07] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 629.609586] nova-conductor[52625]: ERROR nova.conductor.manager [None req-7e45b34e-7dc4-4f84-b605-57329cd60096 tempest-ListImageFiltersTestJSON-803728972 tempest-ListImageFiltersTestJSON-803728972-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 629.609586] nova-conductor[52625]: Traceback (most recent call last): [ 629.609586] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 629.609586] nova-conductor[52625]: return func(*args, **kwargs) [ 629.609586] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 629.609586] nova-conductor[52625]: selections = self._select_destinations( [ 629.609586] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 629.609586] nova-conductor[52625]: selections = self._schedule( [ 629.609586] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 629.609586] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 629.609586] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 629.609586] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 629.609586] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 629.609586] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 629.609586] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 629.609586] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 629.609586] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 629.609586] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 629.609586] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 629.609586] nova-conductor[52625]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 629.609586] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 629.609586] nova-conductor[52625]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 629.609586] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 629.609586] nova-conductor[52625]: ERROR nova.conductor.manager result = self.transport._send( [ 629.609586] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 629.609586] nova-conductor[52625]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 629.609586] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 629.609586] nova-conductor[52625]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 629.609586] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 629.609586] nova-conductor[52625]: ERROR nova.conductor.manager raise result [ 629.609586] nova-conductor[52625]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 629.609586] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 629.609586] nova-conductor[52625]: ERROR nova.conductor.manager [ 629.609586] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 629.609586] nova-conductor[52625]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 629.609586] nova-conductor[52625]: ERROR nova.conductor.manager [ 629.609586] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 629.609586] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._select_destinations( [ 629.609586] nova-conductor[52625]: ERROR nova.conductor.manager [ 629.609586] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 629.609586] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._schedule( [ 629.609586] nova-conductor[52625]: ERROR nova.conductor.manager [ 629.609586] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 629.609586] nova-conductor[52625]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 629.609586] nova-conductor[52625]: ERROR nova.conductor.manager [ 629.609586] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 629.609586] nova-conductor[52625]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 629.609586] nova-conductor[52625]: ERROR nova.conductor.manager [ 629.609586] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 629.609586] nova-conductor[52625]: ERROR nova.conductor.manager [ 629.609586] nova-conductor[52625]: ERROR nova.conductor.manager [ 629.616540] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-7e45b34e-7dc4-4f84-b605-57329cd60096 tempest-ListImageFiltersTestJSON-803728972 tempest-ListImageFiltersTestJSON-803728972-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 629.616693] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-7e45b34e-7dc4-4f84-b605-57329cd60096 tempest-ListImageFiltersTestJSON-803728972 tempest-ListImageFiltersTestJSON-803728972-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 629.617020] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-7e45b34e-7dc4-4f84-b605-57329cd60096 tempest-ListImageFiltersTestJSON-803728972 tempest-ListImageFiltersTestJSON-803728972-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 629.656911] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-7e45b34e-7dc4-4f84-b605-57329cd60096 tempest-ListImageFiltersTestJSON-803728972 tempest-ListImageFiltersTestJSON-803728972-project-member] [instance: 0b3f8669-f9d2-44d2-8306-b7ca2d9e350b] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 629.657636] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-7e45b34e-7dc4-4f84-b605-57329cd60096 tempest-ListImageFiltersTestJSON-803728972 tempest-ListImageFiltersTestJSON-803728972-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 629.657853] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-7e45b34e-7dc4-4f84-b605-57329cd60096 tempest-ListImageFiltersTestJSON-803728972 tempest-ListImageFiltersTestJSON-803728972-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 629.658041] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-7e45b34e-7dc4-4f84-b605-57329cd60096 tempest-ListImageFiltersTestJSON-803728972 tempest-ListImageFiltersTestJSON-803728972-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 629.660996] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-7e45b34e-7dc4-4f84-b605-57329cd60096 tempest-ListImageFiltersTestJSON-803728972 tempest-ListImageFiltersTestJSON-803728972-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 629.660996] nova-conductor[52625]: Traceback (most recent call last): [ 629.660996] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 629.660996] nova-conductor[52625]: return func(*args, **kwargs) [ 629.660996] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 629.660996] nova-conductor[52625]: selections = self._select_destinations( [ 629.660996] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 629.660996] nova-conductor[52625]: selections = self._schedule( [ 629.660996] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 629.660996] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 629.660996] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 629.660996] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 629.660996] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 629.660996] nova-conductor[52625]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 629.661534] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-7e45b34e-7dc4-4f84-b605-57329cd60096 tempest-ListImageFiltersTestJSON-803728972 tempest-ListImageFiltersTestJSON-803728972-project-member] [instance: 0b3f8669-f9d2-44d2-8306-b7ca2d9e350b] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 630.716582] nova-conductor[52626]: ERROR nova.conductor.manager [None req-e2062c2d-80b0-4773-bde4-1350aa976f5b tempest-VolumesAdminNegativeTest-480231075 tempest-VolumesAdminNegativeTest-480231075-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 630.716582] nova-conductor[52626]: Traceback (most recent call last): [ 630.716582] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 630.716582] nova-conductor[52626]: return func(*args, **kwargs) [ 630.716582] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 630.716582] nova-conductor[52626]: selections = self._select_destinations( [ 630.716582] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 630.716582] nova-conductor[52626]: selections = self._schedule( [ 630.716582] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 630.716582] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 630.716582] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 630.716582] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 630.716582] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 630.716582] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 630.716582] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 630.716582] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 630.716582] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 630.716582] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 630.716582] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 630.716582] nova-conductor[52626]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 630.716582] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 630.716582] nova-conductor[52626]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 630.716582] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 630.716582] nova-conductor[52626]: ERROR nova.conductor.manager result = self.transport._send( [ 630.716582] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 630.716582] nova-conductor[52626]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 630.716582] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 630.716582] nova-conductor[52626]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 630.716582] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 630.716582] nova-conductor[52626]: ERROR nova.conductor.manager raise result [ 630.716582] nova-conductor[52626]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 630.716582] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 630.716582] nova-conductor[52626]: ERROR nova.conductor.manager [ 630.716582] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 630.716582] nova-conductor[52626]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 630.716582] nova-conductor[52626]: ERROR nova.conductor.manager [ 630.716582] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 630.716582] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._select_destinations( [ 630.716582] nova-conductor[52626]: ERROR nova.conductor.manager [ 630.716582] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 630.716582] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._schedule( [ 630.716582] nova-conductor[52626]: ERROR nova.conductor.manager [ 630.716582] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 630.716582] nova-conductor[52626]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 630.716582] nova-conductor[52626]: ERROR nova.conductor.manager [ 630.716582] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 630.716582] nova-conductor[52626]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 630.716582] nova-conductor[52626]: ERROR nova.conductor.manager [ 630.716582] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 630.716582] nova-conductor[52626]: ERROR nova.conductor.manager [ 630.716582] nova-conductor[52626]: ERROR nova.conductor.manager [ 630.723112] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-e2062c2d-80b0-4773-bde4-1350aa976f5b tempest-VolumesAdminNegativeTest-480231075 tempest-VolumesAdminNegativeTest-480231075-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 630.723350] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-e2062c2d-80b0-4773-bde4-1350aa976f5b tempest-VolumesAdminNegativeTest-480231075 tempest-VolumesAdminNegativeTest-480231075-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 630.723527] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-e2062c2d-80b0-4773-bde4-1350aa976f5b tempest-VolumesAdminNegativeTest-480231075 tempest-VolumesAdminNegativeTest-480231075-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 630.766383] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-e2062c2d-80b0-4773-bde4-1350aa976f5b tempest-VolumesAdminNegativeTest-480231075 tempest-VolumesAdminNegativeTest-480231075-project-member] [instance: 9729a60c-89a5-47a2-bbf9-d86c891b7731] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 630.766383] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-e2062c2d-80b0-4773-bde4-1350aa976f5b tempest-VolumesAdminNegativeTest-480231075 tempest-VolumesAdminNegativeTest-480231075-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 630.766383] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-e2062c2d-80b0-4773-bde4-1350aa976f5b tempest-VolumesAdminNegativeTest-480231075 tempest-VolumesAdminNegativeTest-480231075-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 630.766383] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-e2062c2d-80b0-4773-bde4-1350aa976f5b tempest-VolumesAdminNegativeTest-480231075 tempest-VolumesAdminNegativeTest-480231075-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 630.774028] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-e2062c2d-80b0-4773-bde4-1350aa976f5b tempest-VolumesAdminNegativeTest-480231075 tempest-VolumesAdminNegativeTest-480231075-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 630.774028] nova-conductor[52626]: Traceback (most recent call last): [ 630.774028] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 630.774028] nova-conductor[52626]: return func(*args, **kwargs) [ 630.774028] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 630.774028] nova-conductor[52626]: selections = self._select_destinations( [ 630.774028] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 630.774028] nova-conductor[52626]: selections = self._schedule( [ 630.774028] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 630.774028] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 630.774028] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 630.774028] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 630.774028] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 630.774028] nova-conductor[52626]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 630.774028] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-e2062c2d-80b0-4773-bde4-1350aa976f5b tempest-VolumesAdminNegativeTest-480231075 tempest-VolumesAdminNegativeTest-480231075-project-member] [instance: 9729a60c-89a5-47a2-bbf9-d86c891b7731] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 631.182303] nova-conductor[52625]: ERROR nova.conductor.manager [None req-45dfa503-7730-4d31-876f-a7c8d960727d tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 631.182303] nova-conductor[52625]: Traceback (most recent call last): [ 631.182303] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 631.182303] nova-conductor[52625]: return func(*args, **kwargs) [ 631.182303] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 631.182303] nova-conductor[52625]: selections = self._select_destinations( [ 631.182303] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 631.182303] nova-conductor[52625]: selections = self._schedule( [ 631.182303] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 631.182303] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 631.182303] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 631.182303] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 631.182303] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 631.182303] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 631.182303] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 631.182303] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 631.182303] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 631.182303] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 631.182303] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 631.182303] nova-conductor[52625]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 631.182303] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 631.182303] nova-conductor[52625]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 631.182303] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 631.182303] nova-conductor[52625]: ERROR nova.conductor.manager result = self.transport._send( [ 631.182303] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 631.182303] nova-conductor[52625]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 631.182303] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 631.182303] nova-conductor[52625]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 631.182303] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 631.182303] nova-conductor[52625]: ERROR nova.conductor.manager raise result [ 631.182303] nova-conductor[52625]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 631.182303] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 631.182303] nova-conductor[52625]: ERROR nova.conductor.manager [ 631.182303] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 631.182303] nova-conductor[52625]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 631.182303] nova-conductor[52625]: ERROR nova.conductor.manager [ 631.182303] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 631.182303] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._select_destinations( [ 631.182303] nova-conductor[52625]: ERROR nova.conductor.manager [ 631.182303] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 631.182303] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._schedule( [ 631.182303] nova-conductor[52625]: ERROR nova.conductor.manager [ 631.182303] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 631.182303] nova-conductor[52625]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 631.182303] nova-conductor[52625]: ERROR nova.conductor.manager [ 631.182303] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 631.182303] nova-conductor[52625]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 631.182303] nova-conductor[52625]: ERROR nova.conductor.manager [ 631.182303] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
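The same NoValidHost failure repeats above for several tempest requests (ServerActionsTestJSON, ListImageFiltersTestJSON, VolumesAdminNegativeTest, ServerDiskConfigTestJSON). When triaging a run like this it helps to reduce the log to one line per failed boot; the small script below does that for the exact warning shape seen here. It assumes one log record per line and a file name of nova-conductor.log, both assumptions about how the capture was saved rather than anything mandated by Nova.

import re

# Matches: "... WARNING nova.scheduler.utils [None req-<id> ...]
#           [instance: <uuid>] Setting instance to ERROR state.: ..."
PATTERN = re.compile(
    r'req-(?P<req>[0-9a-f-]+).*?'
    r'\[instance: (?P<uuid>[0-9a-f-]+)\] Setting instance to ERROR state\.'
)


def failed_boots(log_path='nova-conductor.log'):
    # Yield (request_id, instance_uuid) for every instance put into ERROR.
    with open(log_path) as handle:
        for line in handle:
            match = PATTERN.search(line)
            if match:
                yield match.group('req'), match.group('uuid')


if __name__ == '__main__':
    for req, uuid in failed_boots():
        print('%s -> instance %s set to ERROR (NoValidHost)' % (req, uuid))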
[ 631.182303] nova-conductor[52625]: ERROR nova.conductor.manager [ 631.182303] nova-conductor[52625]: ERROR nova.conductor.manager [ 631.189410] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-45dfa503-7730-4d31-876f-a7c8d960727d tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 631.189669] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-45dfa503-7730-4d31-876f-a7c8d960727d tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 631.190133] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-45dfa503-7730-4d31-876f-a7c8d960727d tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 631.235869] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-45dfa503-7730-4d31-876f-a7c8d960727d tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: c0c7b9a2-4850-418f-8cda-54e7690b5880] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 631.236630] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-45dfa503-7730-4d31-876f-a7c8d960727d tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 631.236936] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-45dfa503-7730-4d31-876f-a7c8d960727d tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 631.237149] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-45dfa503-7730-4d31-876f-a7c8d960727d tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 631.241078] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-45dfa503-7730-4d31-876f-a7c8d960727d tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 631.241078] nova-conductor[52625]: Traceback (most recent call last): [ 631.241078] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 631.241078] nova-conductor[52625]: return func(*args, **kwargs) [ 631.241078] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 631.241078] nova-conductor[52625]: selections = self._select_destinations( [ 631.241078] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 631.241078] nova-conductor[52625]: selections = self._schedule( [ 631.241078] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 631.241078] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 631.241078] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 631.241078] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 631.241078] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 631.241078] nova-conductor[52625]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 631.241655] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-45dfa503-7730-4d31-876f-a7c8d960727d tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: c0c7b9a2-4850-418f-8cda-54e7690b5880] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 635.925425] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Took 0.15 seconds to select destinations for 1 instance(s). 
{{(pid=52625) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 635.944713] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 635.944713] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 635.944713] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 635.981587] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 635.981834] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 635.982029] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 635.982400] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 635.982595] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 635.982763] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 635.992046] nova-conductor[52625]: DEBUG nova.quota [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Getting quotas for project 94fa7bddb7f64f01baa46ea6cba2bdb1. Resources: {'cores', 'instances', 'ram'} {{(pid=52625) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 636.000042] nova-conductor[52625]: DEBUG nova.quota [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Getting quotas for user cd7795b536d448fb9aed5ab18496fc5e and project 94fa7bddb7f64f01baa46ea6cba2bdb1. Resources: {'cores', 'instances', 'ram'} {{(pid=52625) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 636.005343] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52625) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 636.005838] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 636.006160] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 636.006324] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 636.012321] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 636.013171] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 636.013456] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 636.013647] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 636.027439] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 636.028983] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 636.028983] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 637.863799] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] 
Took 0.15 seconds to select destinations for 1 instance(s). {{(pid=52626) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 637.884935] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 637.885327] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 637.885609] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 637.936208] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 637.936451] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 637.937150] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 637.937700] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 637.937893] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s 
{{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 637.938156] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 637.947708] nova-conductor[52626]: DEBUG nova.quota [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Getting quotas for project 58b9dd37c4a94f59a5afc2c931ee30a9. Resources: {'cores', 'instances', 'ram'} {{(pid=52626) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 637.950025] nova-conductor[52626]: DEBUG nova.quota [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Getting quotas for user 81d0dfe7783f4cfebc10dafb19a456fc and project 58b9dd37c4a94f59a5afc2c931ee30a9. Resources: {'cores', 'instances', 'ram'} {{(pid=52626) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 637.955990] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52626) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 637.956411] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 637.956615] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 637.956782] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 637.960532] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 637.961215] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 637.961417] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 637.961585] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 637.977625] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 637.977845] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 637.978196] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 641.204391] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Took 0.17 seconds to select destinations for 1 instance(s). 
{{(pid=52625) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 641.220834] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 641.221220] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 641.221444] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 641.259335] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 641.259625] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 641.259753] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 641.260261] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 641.260971] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 641.260971] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 641.271364] nova-conductor[52625]: DEBUG nova.quota [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Getting quotas for project ae96503b01a9442f96d122810ca18d88. Resources: {'cores', 'instances', 'ram'} {{(pid=52625) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 641.273376] nova-conductor[52625]: DEBUG nova.quota [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Getting quotas for user a1b27a07ed01451683d91d3795f68ce4 and project ae96503b01a9442f96d122810ca18d88. Resources: {'cores', 'instances', 'ram'} {{(pid=52625) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 641.281375] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52625) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 641.281678] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 641.281889] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 641.282074] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 641.284884] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 641.285569] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 641.285784] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 641.285959] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 641.299526] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 641.299716] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 641.299845] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 644.180989] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 
tempest-ServerActionsV293TestJSON-4180666-project-member] Took 0.14 seconds to select destinations for 1 instance(s). {{(pid=52626) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 644.203748] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 644.204141] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 644.204337] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 644.248434] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 644.248671] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 644.248848] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 644.249265] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 644.249457] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 
tempest-ServerActionsV293TestJSON-4180666-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 644.249622] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 644.259518] nova-conductor[52626]: DEBUG nova.quota [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Getting quotas for project 30c26c3c4591499e82e430e68f2889ef. Resources: {'cores', 'instances', 'ram'} {{(pid=52626) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 644.261750] nova-conductor[52626]: DEBUG nova.quota [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Getting quotas for user dc147e2d92aa41bc9c7757eaa9adb7a4 and project 30c26c3c4591499e82e430e68f2889ef. Resources: {'cores', 'instances', 'ram'} {{(pid=52626) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 644.267983] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52626) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 644.268542] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 644.268749] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 644.268922] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 644.271809] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 
tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] block_device_mapping [BlockDeviceMapping(attachment_id=2f835c19-72af-4392-bef3-596395255df5,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='volume',device_name=None,device_type=None,disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id=None,instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='volume',tag=None,updated_at=,uuid=,volume_id='48d0eff8-2973-4ae3-a6b0-4a0f5971f8e0',volume_size=1,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 644.272493] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 644.272701] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 644.272870] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 644.294229] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 644.294473] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 644.294956] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 644.438618] nova-conductor[52625]: DEBUG 
nova.conductor.manager [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Took 0.16 seconds to select destinations for 1 instance(s). {{(pid=52625) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 644.451829] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 644.451829] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 644.451829] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 644.488641] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 644.488893] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 644.489144] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 644.489575] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 644.489712] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-f22188ee-a399-4795-91f6-06999d745a8f 
tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 644.489878] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 644.503735] nova-conductor[52625]: DEBUG nova.quota [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Getting quotas for project 2c0ae2f740df40b6a1987a5eb2d51803. Resources: {'cores', 'instances', 'ram'} {{(pid=52625) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 644.506246] nova-conductor[52625]: DEBUG nova.quota [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Getting quotas for user 9aa8f9a3c0224a47a235d962f79cba8f and project 2c0ae2f740df40b6a1987a5eb2d51803. Resources: {'cores', 'instances', 'ram'} {{(pid=52625) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 644.518082] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52625) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 644.518754] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 644.519125] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 644.519409] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 644.528251] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] [instance: 
57a5dcae-6861-418a-a041-9cd5b7a43982] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 644.528251] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 644.528251] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 644.528251] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 644.543479] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 644.545031] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 644.545031] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 648.423472] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-f7aeea98-40ba-4b13-96a0-735947cb1400 tempest-ServerMetadataNegativeTestJSON-404749069 
tempest-ServerMetadataNegativeTestJSON-404749069-project-member] Took 0.16 seconds to select destinations for 1 instance(s). {{(pid=52626) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 648.433937] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-f7aeea98-40ba-4b13-96a0-735947cb1400 tempest-ServerMetadataNegativeTestJSON-404749069 tempest-ServerMetadataNegativeTestJSON-404749069-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 648.434191] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-f7aeea98-40ba-4b13-96a0-735947cb1400 tempest-ServerMetadataNegativeTestJSON-404749069 tempest-ServerMetadataNegativeTestJSON-404749069-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 648.434358] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-f7aeea98-40ba-4b13-96a0-735947cb1400 tempest-ServerMetadataNegativeTestJSON-404749069 tempest-ServerMetadataNegativeTestJSON-404749069-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 648.469020] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-f7aeea98-40ba-4b13-96a0-735947cb1400 tempest-ServerMetadataNegativeTestJSON-404749069 tempest-ServerMetadataNegativeTestJSON-404749069-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 648.469020] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-f7aeea98-40ba-4b13-96a0-735947cb1400 tempest-ServerMetadataNegativeTestJSON-404749069 tempest-ServerMetadataNegativeTestJSON-404749069-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 648.469020] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-f7aeea98-40ba-4b13-96a0-735947cb1400 tempest-ServerMetadataNegativeTestJSON-404749069 tempest-ServerMetadataNegativeTestJSON-404749069-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 648.469020] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-f7aeea98-40ba-4b13-96a0-735947cb1400 tempest-ServerMetadataNegativeTestJSON-404749069 tempest-ServerMetadataNegativeTestJSON-404749069-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 648.469020] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils 
[None req-f7aeea98-40ba-4b13-96a0-735947cb1400 tempest-ServerMetadataNegativeTestJSON-404749069 tempest-ServerMetadataNegativeTestJSON-404749069-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 648.469020] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-f7aeea98-40ba-4b13-96a0-735947cb1400 tempest-ServerMetadataNegativeTestJSON-404749069 tempest-ServerMetadataNegativeTestJSON-404749069-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 648.482873] nova-conductor[52626]: DEBUG nova.quota [None req-f7aeea98-40ba-4b13-96a0-735947cb1400 tempest-ServerMetadataNegativeTestJSON-404749069 tempest-ServerMetadataNegativeTestJSON-404749069-project-member] Getting quotas for project 139803ec0f68419486debc36866b12f6. Resources: {'cores', 'instances', 'ram'} {{(pid=52626) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 648.487835] nova-conductor[52626]: DEBUG nova.quota [None req-f7aeea98-40ba-4b13-96a0-735947cb1400 tempest-ServerMetadataNegativeTestJSON-404749069 tempest-ServerMetadataNegativeTestJSON-404749069-project-member] Getting quotas for user c1cc19a5a50a4d33b15a10c1cd4ee637 and project 139803ec0f68419486debc36866b12f6. Resources: {'cores', 'instances', 'ram'} {{(pid=52626) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 648.496867] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-f7aeea98-40ba-4b13-96a0-735947cb1400 tempest-ServerMetadataNegativeTestJSON-404749069 tempest-ServerMetadataNegativeTestJSON-404749069-project-member] [instance: 6c94c59c-44ab-4cb9-8480-18e8a424993b] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52626) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 648.498608] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-f7aeea98-40ba-4b13-96a0-735947cb1400 tempest-ServerMetadataNegativeTestJSON-404749069 tempest-ServerMetadataNegativeTestJSON-404749069-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 648.498826] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-f7aeea98-40ba-4b13-96a0-735947cb1400 tempest-ServerMetadataNegativeTestJSON-404749069 tempest-ServerMetadataNegativeTestJSON-404749069-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 648.499009] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-f7aeea98-40ba-4b13-96a0-735947cb1400 tempest-ServerMetadataNegativeTestJSON-404749069 tempest-ServerMetadataNegativeTestJSON-404749069-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 648.502310] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-f7aeea98-40ba-4b13-96a0-735947cb1400 tempest-ServerMetadataNegativeTestJSON-404749069 tempest-ServerMetadataNegativeTestJSON-404749069-project-member] [instance: 6c94c59c-44ab-4cb9-8480-18e8a424993b] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 648.503127] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-f7aeea98-40ba-4b13-96a0-735947cb1400 tempest-ServerMetadataNegativeTestJSON-404749069 tempest-ServerMetadataNegativeTestJSON-404749069-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 648.503194] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-f7aeea98-40ba-4b13-96a0-735947cb1400 tempest-ServerMetadataNegativeTestJSON-404749069 tempest-ServerMetadataNegativeTestJSON-404749069-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 648.503356] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-f7aeea98-40ba-4b13-96a0-735947cb1400 tempest-ServerMetadataNegativeTestJSON-404749069 tempest-ServerMetadataNegativeTestJSON-404749069-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 648.528877] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-f7aeea98-40ba-4b13-96a0-735947cb1400 tempest-ServerMetadataNegativeTestJSON-404749069 tempest-ServerMetadataNegativeTestJSON-404749069-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 648.529160] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-f7aeea98-40ba-4b13-96a0-735947cb1400 tempest-ServerMetadataNegativeTestJSON-404749069 tempest-ServerMetadataNegativeTestJSON-404749069-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 648.529370] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-f7aeea98-40ba-4b13-96a0-735947cb1400 tempest-ServerMetadataNegativeTestJSON-404749069 
tempest-ServerMetadataNegativeTestJSON-404749069-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 655.737030] nova-conductor[52626]: ERROR nova.scheduler.utils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance a0755f79-7df4-4660-92e6-5dd80af94aaa was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 655.741691] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Rescheduling: True {{(pid=52626) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 655.741691] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Failed to 
compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance a0755f79-7df4-4660-92e6-5dd80af94aaa.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance a0755f79-7df4-4660-92e6-5dd80af94aaa. [ 655.741691] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance a0755f79-7df4-4660-92e6-5dd80af94aaa. [ 655.814755] nova-conductor[52626]: DEBUG nova.network.neutron [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] deallocate_for_instance() {{(pid=52626) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 655.938476] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Took 0.18 seconds to select destinations for 1 instance(s). {{(pid=52626) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 655.952811] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 655.953089] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 655.953861] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 655.998224] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 655.998224] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 
tempest-ServerRescueTestJSON-247798492-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 655.998224] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 655.998224] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 655.998224] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 655.998478] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 656.006805] nova-conductor[52626]: DEBUG nova.quota [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Getting quotas for project 72d50358bf2c41d5a556afb101074e8e. Resources: {'cores', 'instances', 'ram'} {{(pid=52626) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 656.009374] nova-conductor[52626]: DEBUG nova.quota [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Getting quotas for user 324274af4ca54998ab7056451e9a0ace and project 72d50358bf2c41d5a556afb101074e8e. 
Resources: {'cores', 'instances', 'ram'} {{(pid=52626) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 656.018037] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52626) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 656.018583] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 656.018805] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 656.018980] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 656.022066] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 656.022735] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 656.022952] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 656.023143] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 656.033243] nova-conductor[52626]: DEBUG nova.network.neutron [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Instance cache missing network info. {{(pid=52626) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 656.039082] nova-conductor[52626]: DEBUG nova.network.neutron [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Updating instance_info_cache with network_info: [] {{(pid=52626) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 656.043469] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 656.043469] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 656.043574] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 657.181227] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-6171457e-e860-4a70-9e29-308983af6168 tempest-SecurityGroupsTestJSON-754098049 tempest-SecurityGroupsTestJSON-754098049-project-member] Took 0.15 seconds to select destinations for 1 instance(s). 
{{(pid=52625) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 657.198053] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-6171457e-e860-4a70-9e29-308983af6168 tempest-SecurityGroupsTestJSON-754098049 tempest-SecurityGroupsTestJSON-754098049-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 657.198053] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-6171457e-e860-4a70-9e29-308983af6168 tempest-SecurityGroupsTestJSON-754098049 tempest-SecurityGroupsTestJSON-754098049-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 657.198053] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-6171457e-e860-4a70-9e29-308983af6168 tempest-SecurityGroupsTestJSON-754098049 tempest-SecurityGroupsTestJSON-754098049-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 657.238292] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-6171457e-e860-4a70-9e29-308983af6168 tempest-SecurityGroupsTestJSON-754098049 tempest-SecurityGroupsTestJSON-754098049-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 657.238292] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-6171457e-e860-4a70-9e29-308983af6168 tempest-SecurityGroupsTestJSON-754098049 tempest-SecurityGroupsTestJSON-754098049-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 657.238292] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-6171457e-e860-4a70-9e29-308983af6168 tempest-SecurityGroupsTestJSON-754098049 tempest-SecurityGroupsTestJSON-754098049-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 657.238292] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-6171457e-e860-4a70-9e29-308983af6168 tempest-SecurityGroupsTestJSON-754098049 tempest-SecurityGroupsTestJSON-754098049-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 657.238292] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-6171457e-e860-4a70-9e29-308983af6168 tempest-SecurityGroupsTestJSON-754098049 tempest-SecurityGroupsTestJSON-754098049-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 657.238292] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-6171457e-e860-4a70-9e29-308983af6168 tempest-SecurityGroupsTestJSON-754098049 tempest-SecurityGroupsTestJSON-754098049-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 657.249804] nova-conductor[52625]: DEBUG nova.quota [None req-6171457e-e860-4a70-9e29-308983af6168 tempest-SecurityGroupsTestJSON-754098049 tempest-SecurityGroupsTestJSON-754098049-project-member] Getting quotas for project 79bbd06fa18e4fccbed1f44e4a30d562. Resources: {'cores', 'instances', 'ram'} {{(pid=52625) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 657.251607] nova-conductor[52625]: DEBUG nova.quota [None req-6171457e-e860-4a70-9e29-308983af6168 tempest-SecurityGroupsTestJSON-754098049 tempest-SecurityGroupsTestJSON-754098049-project-member] Getting quotas for user 42a62c78e5524d088c47092ef01b885e and project 79bbd06fa18e4fccbed1f44e4a30d562. Resources: {'cores', 'instances', 'ram'} {{(pid=52625) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 657.257510] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-6171457e-e860-4a70-9e29-308983af6168 tempest-SecurityGroupsTestJSON-754098049 tempest-SecurityGroupsTestJSON-754098049-project-member] [instance: 71244679-78d6-4d49-b4b5-ef96fd313ae8] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52625) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 657.258223] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-6171457e-e860-4a70-9e29-308983af6168 tempest-SecurityGroupsTestJSON-754098049 tempest-SecurityGroupsTestJSON-754098049-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 657.258581] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-6171457e-e860-4a70-9e29-308983af6168 tempest-SecurityGroupsTestJSON-754098049 tempest-SecurityGroupsTestJSON-754098049-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 657.258890] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-6171457e-e860-4a70-9e29-308983af6168 tempest-SecurityGroupsTestJSON-754098049 tempest-SecurityGroupsTestJSON-754098049-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 657.261767] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-6171457e-e860-4a70-9e29-308983af6168 tempest-SecurityGroupsTestJSON-754098049 tempest-SecurityGroupsTestJSON-754098049-project-member] [instance: 71244679-78d6-4d49-b4b5-ef96fd313ae8] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 657.264321] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-6171457e-e860-4a70-9e29-308983af6168 tempest-SecurityGroupsTestJSON-754098049 tempest-SecurityGroupsTestJSON-754098049-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 657.264321] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-6171457e-e860-4a70-9e29-308983af6168 tempest-SecurityGroupsTestJSON-754098049 tempest-SecurityGroupsTestJSON-754098049-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 657.264321] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-6171457e-e860-4a70-9e29-308983af6168 tempest-SecurityGroupsTestJSON-754098049 tempest-SecurityGroupsTestJSON-754098049-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 657.280973] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-6171457e-e860-4a70-9e29-308983af6168 tempest-SecurityGroupsTestJSON-754098049 tempest-SecurityGroupsTestJSON-754098049-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 657.281291] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-6171457e-e860-4a70-9e29-308983af6168 tempest-SecurityGroupsTestJSON-754098049 tempest-SecurityGroupsTestJSON-754098049-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 657.281553] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-6171457e-e860-4a70-9e29-308983af6168 tempest-SecurityGroupsTestJSON-754098049 tempest-SecurityGroupsTestJSON-754098049-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 664.207713] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-554d3b0c-04fc-4d9e-922c-f3b7f39c74e5 tempest-InstanceActionsTestJSON-1474696113 tempest-InstanceActionsTestJSON-1474696113-project-member] Took 
0.15 seconds to select destinations for 1 instance(s). {{(pid=52625) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 664.226644] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-554d3b0c-04fc-4d9e-922c-f3b7f39c74e5 tempest-InstanceActionsTestJSON-1474696113 tempest-InstanceActionsTestJSON-1474696113-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 664.227027] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-554d3b0c-04fc-4d9e-922c-f3b7f39c74e5 tempest-InstanceActionsTestJSON-1474696113 tempest-InstanceActionsTestJSON-1474696113-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 664.227646] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-554d3b0c-04fc-4d9e-922c-f3b7f39c74e5 tempest-InstanceActionsTestJSON-1474696113 tempest-InstanceActionsTestJSON-1474696113-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 664.295159] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-554d3b0c-04fc-4d9e-922c-f3b7f39c74e5 tempest-InstanceActionsTestJSON-1474696113 tempest-InstanceActionsTestJSON-1474696113-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 664.295159] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-554d3b0c-04fc-4d9e-922c-f3b7f39c74e5 tempest-InstanceActionsTestJSON-1474696113 tempest-InstanceActionsTestJSON-1474696113-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 664.295159] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-554d3b0c-04fc-4d9e-922c-f3b7f39c74e5 tempest-InstanceActionsTestJSON-1474696113 tempest-InstanceActionsTestJSON-1474696113-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 664.295362] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-554d3b0c-04fc-4d9e-922c-f3b7f39c74e5 tempest-InstanceActionsTestJSON-1474696113 tempest-InstanceActionsTestJSON-1474696113-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 664.295605] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-554d3b0c-04fc-4d9e-922c-f3b7f39c74e5 tempest-InstanceActionsTestJSON-1474696113 tempest-InstanceActionsTestJSON-1474696113-project-member] Lock 
"e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 664.296167] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-554d3b0c-04fc-4d9e-922c-f3b7f39c74e5 tempest-InstanceActionsTestJSON-1474696113 tempest-InstanceActionsTestJSON-1474696113-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 664.311753] nova-conductor[52625]: DEBUG nova.quota [None req-554d3b0c-04fc-4d9e-922c-f3b7f39c74e5 tempest-InstanceActionsTestJSON-1474696113 tempest-InstanceActionsTestJSON-1474696113-project-member] Getting quotas for project c208886723e4446b8116617285cea4fe. Resources: {'cores', 'instances', 'ram'} {{(pid=52625) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 664.316072] nova-conductor[52625]: DEBUG nova.quota [None req-554d3b0c-04fc-4d9e-922c-f3b7f39c74e5 tempest-InstanceActionsTestJSON-1474696113 tempest-InstanceActionsTestJSON-1474696113-project-member] Getting quotas for user 9bec6140223449a0b7aa69fb17baaa02 and project c208886723e4446b8116617285cea4fe. Resources: {'cores', 'instances', 'ram'} {{(pid=52625) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 664.327142] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-554d3b0c-04fc-4d9e-922c-f3b7f39c74e5 tempest-InstanceActionsTestJSON-1474696113 tempest-InstanceActionsTestJSON-1474696113-project-member] [instance: fb825c5f-bd66-40aa-8027-cb425f3b9b96] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52625) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 664.327142] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-554d3b0c-04fc-4d9e-922c-f3b7f39c74e5 tempest-InstanceActionsTestJSON-1474696113 tempest-InstanceActionsTestJSON-1474696113-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 664.327261] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-554d3b0c-04fc-4d9e-922c-f3b7f39c74e5 tempest-InstanceActionsTestJSON-1474696113 tempest-InstanceActionsTestJSON-1474696113-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 664.327968] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-554d3b0c-04fc-4d9e-922c-f3b7f39c74e5 tempest-InstanceActionsTestJSON-1474696113 tempest-InstanceActionsTestJSON-1474696113-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 664.331541] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-554d3b0c-04fc-4d9e-922c-f3b7f39c74e5 tempest-InstanceActionsTestJSON-1474696113 tempest-InstanceActionsTestJSON-1474696113-project-member] [instance: 
fb825c5f-bd66-40aa-8027-cb425f3b9b96] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 664.332597] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-554d3b0c-04fc-4d9e-922c-f3b7f39c74e5 tempest-InstanceActionsTestJSON-1474696113 tempest-InstanceActionsTestJSON-1474696113-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 664.332930] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-554d3b0c-04fc-4d9e-922c-f3b7f39c74e5 tempest-InstanceActionsTestJSON-1474696113 tempest-InstanceActionsTestJSON-1474696113-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 664.333251] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-554d3b0c-04fc-4d9e-922c-f3b7f39c74e5 tempest-InstanceActionsTestJSON-1474696113 tempest-InstanceActionsTestJSON-1474696113-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 664.349968] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-554d3b0c-04fc-4d9e-922c-f3b7f39c74e5 tempest-InstanceActionsTestJSON-1474696113 tempest-InstanceActionsTestJSON-1474696113-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 664.350311] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-554d3b0c-04fc-4d9e-922c-f3b7f39c74e5 tempest-InstanceActionsTestJSON-1474696113 tempest-InstanceActionsTestJSON-1474696113-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 664.350606] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-554d3b0c-04fc-4d9e-922c-f3b7f39c74e5 tempest-InstanceActionsTestJSON-1474696113 tempest-InstanceActionsTestJSON-1474696113-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 665.828856] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-27f74e42-677d-4112-bb9f-55e756a3a7fd 
tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Took 0.14 seconds to select destinations for 1 instance(s). {{(pid=52626) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 665.843152] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-27f74e42-677d-4112-bb9f-55e756a3a7fd tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 665.843900] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-27f74e42-677d-4112-bb9f-55e756a3a7fd tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 665.843900] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-27f74e42-677d-4112-bb9f-55e756a3a7fd tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 665.880021] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-27f74e42-677d-4112-bb9f-55e756a3a7fd tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 665.880021] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-27f74e42-677d-4112-bb9f-55e756a3a7fd tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 665.880021] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-27f74e42-677d-4112-bb9f-55e756a3a7fd tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 665.880021] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-27f74e42-677d-4112-bb9f-55e756a3a7fd tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 665.880021] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-27f74e42-677d-4112-bb9f-55e756a3a7fd 
tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 665.880021] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-27f74e42-677d-4112-bb9f-55e756a3a7fd tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 665.890482] nova-conductor[52626]: DEBUG nova.quota [None req-27f74e42-677d-4112-bb9f-55e756a3a7fd tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Getting quotas for project 5b2c68e89b4547f89e2e42fdc728a9b6. Resources: {'cores', 'instances', 'ram'} {{(pid=52626) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 665.892124] nova-conductor[52626]: DEBUG nova.quota [None req-27f74e42-677d-4112-bb9f-55e756a3a7fd tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Getting quotas for user 45c0c1b71616412d8daa198e4557efd9 and project 5b2c68e89b4547f89e2e42fdc728a9b6. Resources: {'cores', 'instances', 'ram'} {{(pid=52626) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 665.898855] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-27f74e42-677d-4112-bb9f-55e756a3a7fd tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] [instance: c76409ad-b0aa-4da6-ac83-58f617ec2588] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52626) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 665.899538] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-27f74e42-677d-4112-bb9f-55e756a3a7fd tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 665.899608] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-27f74e42-677d-4112-bb9f-55e756a3a7fd tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 665.899772] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-27f74e42-677d-4112-bb9f-55e756a3a7fd tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 665.910213] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-27f74e42-677d-4112-bb9f-55e756a3a7fd tempest-DeleteServersTestJSON-1745354174 
tempest-DeleteServersTestJSON-1745354174-project-member] [instance: c76409ad-b0aa-4da6-ac83-58f617ec2588] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 665.910213] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-27f74e42-677d-4112-bb9f-55e756a3a7fd tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 665.910369] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-27f74e42-677d-4112-bb9f-55e756a3a7fd tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 665.910490] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-27f74e42-677d-4112-bb9f-55e756a3a7fd tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 665.935305] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-27f74e42-677d-4112-bb9f-55e756a3a7fd tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 665.935305] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-27f74e42-677d-4112-bb9f-55e756a3a7fd tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 665.935305] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-27f74e42-677d-4112-bb9f-55e756a3a7fd tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 667.936037] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None 
req-94915086-54d2-4bcf-bcb7-396933e24f9e tempest-ServerGroupTestJSON-801235064 tempest-ServerGroupTestJSON-801235064-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 667.936330] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-94915086-54d2-4bcf-bcb7-396933e24f9e tempest-ServerGroupTestJSON-801235064 tempest-ServerGroupTestJSON-801235064-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 667.936498] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-94915086-54d2-4bcf-bcb7-396933e24f9e tempest-ServerGroupTestJSON-801235064 tempest-ServerGroupTestJSON-801235064-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 668.082025] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-94915086-54d2-4bcf-bcb7-396933e24f9e tempest-ServerGroupTestJSON-801235064 tempest-ServerGroupTestJSON-801235064-project-member] Took 0.13 seconds to select destinations for 1 instance(s). {{(pid=52625) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 668.094677] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-94915086-54d2-4bcf-bcb7-396933e24f9e tempest-ServerGroupTestJSON-801235064 tempest-ServerGroupTestJSON-801235064-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 668.094787] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-94915086-54d2-4bcf-bcb7-396933e24f9e tempest-ServerGroupTestJSON-801235064 tempest-ServerGroupTestJSON-801235064-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 668.095015] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-94915086-54d2-4bcf-bcb7-396933e24f9e tempest-ServerGroupTestJSON-801235064 tempest-ServerGroupTestJSON-801235064-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 668.126396] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-94915086-54d2-4bcf-bcb7-396933e24f9e tempest-ServerGroupTestJSON-801235064 tempest-ServerGroupTestJSON-801235064-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 668.126636] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-94915086-54d2-4bcf-bcb7-396933e24f9e tempest-ServerGroupTestJSON-801235064 
tempest-ServerGroupTestJSON-801235064-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 668.127038] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-94915086-54d2-4bcf-bcb7-396933e24f9e tempest-ServerGroupTestJSON-801235064 tempest-ServerGroupTestJSON-801235064-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 668.127192] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-94915086-54d2-4bcf-bcb7-396933e24f9e tempest-ServerGroupTestJSON-801235064 tempest-ServerGroupTestJSON-801235064-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 668.127379] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-94915086-54d2-4bcf-bcb7-396933e24f9e tempest-ServerGroupTestJSON-801235064 tempest-ServerGroupTestJSON-801235064-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 668.127611] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-94915086-54d2-4bcf-bcb7-396933e24f9e tempest-ServerGroupTestJSON-801235064 tempest-ServerGroupTestJSON-801235064-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 668.136649] nova-conductor[52625]: DEBUG nova.quota [None req-94915086-54d2-4bcf-bcb7-396933e24f9e tempest-ServerGroupTestJSON-801235064 tempest-ServerGroupTestJSON-801235064-project-member] Getting quotas for project 2844c29f0996471cb0ac2b4ebec40b27. Resources: {'cores', 'instances', 'ram'} {{(pid=52625) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 668.138782] nova-conductor[52625]: DEBUG nova.quota [None req-94915086-54d2-4bcf-bcb7-396933e24f9e tempest-ServerGroupTestJSON-801235064 tempest-ServerGroupTestJSON-801235064-project-member] Getting quotas for user ff5aa8402b674625a61ad6edf0983a42 and project 2844c29f0996471cb0ac2b4ebec40b27. 
Resources: {'cores', 'instances', 'ram'} {{(pid=52625) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 668.144161] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-94915086-54d2-4bcf-bcb7-396933e24f9e tempest-ServerGroupTestJSON-801235064 tempest-ServerGroupTestJSON-801235064-project-member] [instance: 63823a4b-97e0-48f9-9fb9-7c4fe3858343] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52625) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 668.144723] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-94915086-54d2-4bcf-bcb7-396933e24f9e tempest-ServerGroupTestJSON-801235064 tempest-ServerGroupTestJSON-801235064-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 668.144977] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-94915086-54d2-4bcf-bcb7-396933e24f9e tempest-ServerGroupTestJSON-801235064 tempest-ServerGroupTestJSON-801235064-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 668.145186] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-94915086-54d2-4bcf-bcb7-396933e24f9e tempest-ServerGroupTestJSON-801235064 tempest-ServerGroupTestJSON-801235064-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 668.148078] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-94915086-54d2-4bcf-bcb7-396933e24f9e tempest-ServerGroupTestJSON-801235064 tempest-ServerGroupTestJSON-801235064-project-member] [instance: 63823a4b-97e0-48f9-9fb9-7c4fe3858343] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 668.148709] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-94915086-54d2-4bcf-bcb7-396933e24f9e tempest-ServerGroupTestJSON-801235064 tempest-ServerGroupTestJSON-801235064-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 668.148911] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-94915086-54d2-4bcf-bcb7-396933e24f9e tempest-ServerGroupTestJSON-801235064 tempest-ServerGroupTestJSON-801235064-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 668.149114] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-94915086-54d2-4bcf-bcb7-396933e24f9e tempest-ServerGroupTestJSON-801235064 tempest-ServerGroupTestJSON-801235064-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 668.161872] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-94915086-54d2-4bcf-bcb7-396933e24f9e tempest-ServerGroupTestJSON-801235064 tempest-ServerGroupTestJSON-801235064-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 668.162106] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-94915086-54d2-4bcf-bcb7-396933e24f9e tempest-ServerGroupTestJSON-801235064 tempest-ServerGroupTestJSON-801235064-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 668.162284] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-94915086-54d2-4bcf-bcb7-396933e24f9e tempest-ServerGroupTestJSON-801235064 tempest-ServerGroupTestJSON-801235064-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 673.957412] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-3d92add3-634e-4b1e-8d5a-9590d765273e tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Took 0.14 seconds to select destinations for 1 instance(s). 
{{(pid=52626) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 673.974027] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-3d92add3-634e-4b1e-8d5a-9590d765273e tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 673.974027] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-3d92add3-634e-4b1e-8d5a-9590d765273e tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 673.974027] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-3d92add3-634e-4b1e-8d5a-9590d765273e tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 674.017113] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-3d92add3-634e-4b1e-8d5a-9590d765273e tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 674.017390] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-3d92add3-634e-4b1e-8d5a-9590d765273e tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 674.017631] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-3d92add3-634e-4b1e-8d5a-9590d765273e tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 674.017934] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-3d92add3-634e-4b1e-8d5a-9590d765273e tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 674.018143] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-3d92add3-634e-4b1e-8d5a-9590d765273e tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 
0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 674.018315] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-3d92add3-634e-4b1e-8d5a-9590d765273e tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 674.027661] nova-conductor[52626]: DEBUG nova.quota [None req-3d92add3-634e-4b1e-8d5a-9590d765273e tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Getting quotas for project 027d599d310b4abf9ce371b09bb3253b. Resources: {'cores', 'instances', 'ram'} {{(pid=52626) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 674.032639] nova-conductor[52626]: DEBUG nova.quota [None req-3d92add3-634e-4b1e-8d5a-9590d765273e tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Getting quotas for user f6f1fc08157841f3bdd66c7e1bc5afa8 and project 027d599d310b4abf9ce371b09bb3253b. Resources: {'cores', 'instances', 'ram'} {{(pid=52626) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 674.043045] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-3d92add3-634e-4b1e-8d5a-9590d765273e tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: 070d142d-6a47-49bc-a061-3101da79447a] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52626) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 674.043045] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-3d92add3-634e-4b1e-8d5a-9590d765273e tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 674.043045] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-3d92add3-634e-4b1e-8d5a-9590d765273e tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 674.043045] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-3d92add3-634e-4b1e-8d5a-9590d765273e tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 674.045056] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-3d92add3-634e-4b1e-8d5a-9590d765273e tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: 070d142d-6a47-49bc-a061-3101da79447a] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 674.045880] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-3d92add3-634e-4b1e-8d5a-9590d765273e tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 674.046225] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-3d92add3-634e-4b1e-8d5a-9590d765273e tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 674.046507] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-3d92add3-634e-4b1e-8d5a-9590d765273e tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 674.062725] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-3d92add3-634e-4b1e-8d5a-9590d765273e tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 674.062956] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-3d92add3-634e-4b1e-8d5a-9590d765273e tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 674.063144] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-3d92add3-634e-4b1e-8d5a-9590d765273e tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 705.182407] nova-conductor[52625]: ERROR nova.scheduler.utils [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 
99d97004-9f23-48ee-a88b-75fdb6acc4b8] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance 99d97004-9f23-48ee-a88b-75fdb6acc4b8 was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 705.183155] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Rescheduling: True {{(pid=52625) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 705.183399] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 99d97004-9f23-48ee-a88b-75fdb6acc4b8.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 99d97004-9f23-48ee-a88b-75fdb6acc4b8. 
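Editor's note: the traceback above is the root cause of this run's failures. The VMware driver's disk copy fails with VimFaultException ("A specified parameter was not correct: fileType"), the compute manager converts that into RescheduledException, and the conductor, with no further hosts left to retry on, raises MaxRetriesExceeded. A minimal, hypothetical sketch of that retry-exhaustion decision follows (simplified names and signatures; not Nova's actual code):

    # Hypothetical, simplified sketch of the reschedule-exhaustion decision visible in
    # the log above: after a host reports a build failure, the conductor only retries
    # while alternate hosts remain and the retry budget is not exhausted.
    MAX_ATTEMPTS = 3  # assumption; Nova reads the real value from its scheduler config

    class MaxRetriesExceeded(Exception):
        pass

    def next_host_or_fail(instance_uuid, attempted_hosts, alternates,
                          max_attempts=MAX_ATTEMPTS):
        """Return the next host to retry on, or raise once retries are exhausted."""
        if len(attempted_hosts) >= max_attempts or not alternates:
            raise MaxRetriesExceeded(
                "Exceeded maximum number of retries. Exhausted all hosts available "
                "for retrying build failures for instance %s." % instance_uuid)
        return alternates.pop(0)

In this run only cpu-1 ever appears as a candidate host (and earlier selections report "Alternates: []"), so the first build failure already exhausts the list and the instance is set to ERROR, as the next warning records.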
[ 705.183741] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 99d97004-9f23-48ee-a88b-75fdb6acc4b8. [ 705.210071] nova-conductor[52625]: DEBUG nova.network.neutron [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] deallocate_for_instance() {{(pid=52625) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 705.341161] nova-conductor[52625]: DEBUG nova.network.neutron [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Instance cache missing network info. {{(pid=52625) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 705.349926] nova-conductor[52625]: DEBUG nova.network.neutron [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Updating instance_info_cache with network_info: [] {{(pid=52625) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 707.132405] nova-conductor[52625]: ERROR nova.conductor.manager [None req-ad844fcd-4088-4c03-971e-f8fbc952e2cc tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 707.132405] nova-conductor[52625]: Traceback (most recent call last): [ 707.132405] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 707.132405] nova-conductor[52625]: return func(*args, **kwargs) [ 707.132405] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 707.132405] nova-conductor[52625]: selections = self._select_destinations( [ 707.132405] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 707.132405] nova-conductor[52625]: selections = self._schedule( [ 707.132405] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 707.132405] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 707.132405] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 707.132405] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 707.132405] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 707.132405] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 707.132405] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 707.132405] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 707.132405] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 707.132405] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 707.132405] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 707.132405] nova-conductor[52625]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 707.132405] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 707.132405] nova-conductor[52625]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 707.132405] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 707.132405] nova-conductor[52625]: ERROR nova.conductor.manager result = self.transport._send( [ 707.132405] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 707.132405] nova-conductor[52625]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 707.132405] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 707.132405] nova-conductor[52625]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 707.132405] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 707.132405] nova-conductor[52625]: ERROR nova.conductor.manager raise result [ 707.132405] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 707.132405] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 707.132405] nova-conductor[52625]: ERROR nova.conductor.manager [ 707.132405] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 707.132405] nova-conductor[52625]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 707.132405] nova-conductor[52625]: ERROR nova.conductor.manager [ 707.132405] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 707.132405] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._select_destinations( [ 707.132405] nova-conductor[52625]: ERROR nova.conductor.manager [ 707.132405] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 707.132405] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._schedule( [ 707.132405] nova-conductor[52625]: ERROR nova.conductor.manager [ 707.132405] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 707.132405] nova-conductor[52625]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 707.132405] nova-conductor[52625]: ERROR nova.conductor.manager [ 707.132405] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 707.132405] nova-conductor[52625]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 707.132405] nova-conductor[52625]: ERROR nova.conductor.manager [ 707.132405] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 707.132405] nova-conductor[52625]: ERROR nova.conductor.manager [ 707.132405] nova-conductor[52625]: ERROR nova.conductor.manager [ 707.139339] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-ad844fcd-4088-4c03-971e-f8fbc952e2cc tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 707.139641] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-ad844fcd-4088-4c03-971e-f8fbc952e2cc tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 707.139995] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-ad844fcd-4088-4c03-971e-f8fbc952e2cc tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 707.185744] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-ad844fcd-4088-4c03-971e-f8fbc952e2cc tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: c4b552ec-96eb-4657-b351-52e47ba9cf9d] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 707.186460] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-ad844fcd-4088-4c03-971e-f8fbc952e2cc tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 707.186683] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-ad844fcd-4088-4c03-971e-f8fbc952e2cc tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 707.186863] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-ad844fcd-4088-4c03-971e-f8fbc952e2cc tempest-DeleteServersAdminTestJSON-579292529 
tempest-DeleteServersAdminTestJSON-579292529-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 707.189596] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-ad844fcd-4088-4c03-971e-f8fbc952e2cc tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 707.189596] nova-conductor[52625]: Traceback (most recent call last): [ 707.189596] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 707.189596] nova-conductor[52625]: return func(*args, **kwargs) [ 707.189596] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 707.189596] nova-conductor[52625]: selections = self._select_destinations( [ 707.189596] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 707.189596] nova-conductor[52625]: selections = self._schedule( [ 707.189596] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 707.189596] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 707.189596] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 707.189596] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 707.189596] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 707.189596] nova-conductor[52625]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 707.190278] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-ad844fcd-4088-4c03-971e-f8fbc952e2cc tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: c4b552ec-96eb-4657-b351-52e47ba9cf9d] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 714.214033] nova-conductor[52625]: ERROR nova.conductor.manager [None req-af71aa06-7cf4-46d3-879e-71aa9a8ee2d4 tempest-ServersTestFqdnHostnames-333067657 tempest-ServersTestFqdnHostnames-333067657-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 714.214033] nova-conductor[52625]: Traceback (most recent call last): [ 714.214033] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 714.214033] nova-conductor[52625]: return func(*args, **kwargs) [ 714.214033] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 714.214033] nova-conductor[52625]: selections = self._select_destinations( [ 714.214033] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 714.214033] nova-conductor[52625]: selections = self._schedule( [ 714.214033] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 714.214033] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 714.214033] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 714.214033] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 714.214033] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 714.214033] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 714.214033] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 714.214033] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 714.214033] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 714.214033] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 714.214033] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 714.214033] nova-conductor[52625]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 714.214033] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 714.214033] nova-conductor[52625]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 714.214033] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 714.214033] nova-conductor[52625]: ERROR nova.conductor.manager result = self.transport._send( [ 714.214033] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 714.214033] nova-conductor[52625]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 714.214033] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 714.214033] nova-conductor[52625]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 714.214033] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 714.214033] nova-conductor[52625]: ERROR nova.conductor.manager raise result [ 714.214033] nova-conductor[52625]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 714.214033] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 714.214033] nova-conductor[52625]: ERROR nova.conductor.manager [ 714.214033] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 714.214033] nova-conductor[52625]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 714.214033] nova-conductor[52625]: ERROR nova.conductor.manager [ 714.214033] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 714.214033] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._select_destinations( [ 714.214033] nova-conductor[52625]: ERROR nova.conductor.manager [ 714.214033] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 714.214033] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._schedule( [ 714.214033] nova-conductor[52625]: ERROR nova.conductor.manager [ 714.214033] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 714.214033] nova-conductor[52625]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 714.214033] nova-conductor[52625]: ERROR nova.conductor.manager [ 714.214033] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 714.214033] nova-conductor[52625]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 714.214033] nova-conductor[52625]: ERROR nova.conductor.manager [ 714.214033] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 714.214033] nova-conductor[52625]: ERROR nova.conductor.manager [ 714.214033] nova-conductor[52625]: ERROR nova.conductor.manager [ 714.227620] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-af71aa06-7cf4-46d3-879e-71aa9a8ee2d4 tempest-ServersTestFqdnHostnames-333067657 tempest-ServersTestFqdnHostnames-333067657-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 714.227935] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-af71aa06-7cf4-46d3-879e-71aa9a8ee2d4 tempest-ServersTestFqdnHostnames-333067657 tempest-ServersTestFqdnHostnames-333067657-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 714.228123] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-af71aa06-7cf4-46d3-879e-71aa9a8ee2d4 tempest-ServersTestFqdnHostnames-333067657 tempest-ServersTestFqdnHostnames-333067657-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 714.272645] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-af71aa06-7cf4-46d3-879e-71aa9a8ee2d4 tempest-ServersTestFqdnHostnames-333067657 tempest-ServersTestFqdnHostnames-333067657-project-member] [instance: 1cceafd2-062d-4c6d-8fef-e6264eb329a2] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 714.273395] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-af71aa06-7cf4-46d3-879e-71aa9a8ee2d4 tempest-ServersTestFqdnHostnames-333067657 tempest-ServersTestFqdnHostnames-333067657-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 714.273618] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-af71aa06-7cf4-46d3-879e-71aa9a8ee2d4 tempest-ServersTestFqdnHostnames-333067657 tempest-ServersTestFqdnHostnames-333067657-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 714.273792] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-af71aa06-7cf4-46d3-879e-71aa9a8ee2d4 tempest-ServersTestFqdnHostnames-333067657 tempest-ServersTestFqdnHostnames-333067657-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 714.277228] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-af71aa06-7cf4-46d3-879e-71aa9a8ee2d4 tempest-ServersTestFqdnHostnames-333067657 tempest-ServersTestFqdnHostnames-333067657-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 714.277228] nova-conductor[52625]: Traceback (most recent call last): [ 714.277228] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 714.277228] nova-conductor[52625]: return func(*args, **kwargs) [ 714.277228] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 714.277228] nova-conductor[52625]: selections = self._select_destinations( [ 714.277228] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 714.277228] nova-conductor[52625]: selections = self._schedule( [ 714.277228] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 714.277228] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 714.277228] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 714.277228] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 714.277228] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 714.277228] nova-conductor[52625]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 714.277784] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-af71aa06-7cf4-46d3-879e-71aa9a8ee2d4 tempest-ServersTestFqdnHostnames-333067657 tempest-ServersTestFqdnHostnames-333067657-project-member] [instance: 1cceafd2-062d-4c6d-8fef-e6264eb329a2] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 720.754447] nova-conductor[52626]: ERROR nova.conductor.manager [None req-9f5d36c2-446c-4a98-b988-a0a34edf8da0 tempest-InstanceActionsNegativeTestJSON-771215476 tempest-InstanceActionsNegativeTestJSON-771215476-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 720.754447] nova-conductor[52626]: Traceback (most recent call last): [ 720.754447] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 720.754447] nova-conductor[52626]: return func(*args, **kwargs) [ 720.754447] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 720.754447] nova-conductor[52626]: selections = self._select_destinations( [ 720.754447] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 720.754447] nova-conductor[52626]: selections = self._schedule( [ 720.754447] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 720.754447] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 720.754447] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 720.754447] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 720.754447] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 720.754447] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 720.754447] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 720.754447] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 720.754447] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 720.754447] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 720.754447] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 720.754447] nova-conductor[52626]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 720.754447] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 720.754447] nova-conductor[52626]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 720.754447] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 720.754447] nova-conductor[52626]: ERROR nova.conductor.manager result = self.transport._send( [ 720.754447] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 720.754447] nova-conductor[52626]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 720.754447] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 720.754447] nova-conductor[52626]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 720.754447] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 720.754447] nova-conductor[52626]: ERROR nova.conductor.manager raise result [ 720.754447] nova-conductor[52626]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 720.754447] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 720.754447] nova-conductor[52626]: ERROR nova.conductor.manager [ 720.754447] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 720.754447] nova-conductor[52626]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 720.754447] nova-conductor[52626]: ERROR nova.conductor.manager [ 720.754447] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 720.754447] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._select_destinations( [ 720.754447] nova-conductor[52626]: ERROR nova.conductor.manager [ 720.754447] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 720.754447] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._schedule( [ 720.754447] nova-conductor[52626]: ERROR nova.conductor.manager [ 720.754447] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 720.754447] nova-conductor[52626]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 720.754447] nova-conductor[52626]: ERROR nova.conductor.manager [ 720.754447] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 720.754447] nova-conductor[52626]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 720.754447] nova-conductor[52626]: ERROR nova.conductor.manager [ 720.754447] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 720.754447] nova-conductor[52626]: ERROR nova.conductor.manager [ 720.754447] nova-conductor[52626]: ERROR nova.conductor.manager [ 720.765020] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-9f5d36c2-446c-4a98-b988-a0a34edf8da0 tempest-InstanceActionsNegativeTestJSON-771215476 tempest-InstanceActionsNegativeTestJSON-771215476-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 720.765020] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-9f5d36c2-446c-4a98-b988-a0a34edf8da0 tempest-InstanceActionsNegativeTestJSON-771215476 tempest-InstanceActionsNegativeTestJSON-771215476-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 720.765020] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-9f5d36c2-446c-4a98-b988-a0a34edf8da0 tempest-InstanceActionsNegativeTestJSON-771215476 tempest-InstanceActionsNegativeTestJSON-771215476-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 720.816927] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-9f5d36c2-446c-4a98-b988-a0a34edf8da0 tempest-InstanceActionsNegativeTestJSON-771215476 tempest-InstanceActionsNegativeTestJSON-771215476-project-member] [instance: 7fa2dad0-ce34-41af-9c65-d084a503be37] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 720.817669] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-9f5d36c2-446c-4a98-b988-a0a34edf8da0 tempest-InstanceActionsNegativeTestJSON-771215476 tempest-InstanceActionsNegativeTestJSON-771215476-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 720.817903] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-9f5d36c2-446c-4a98-b988-a0a34edf8da0 tempest-InstanceActionsNegativeTestJSON-771215476 tempest-InstanceActionsNegativeTestJSON-771215476-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 720.818092] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-9f5d36c2-446c-4a98-b988-a0a34edf8da0 
tempest-InstanceActionsNegativeTestJSON-771215476 tempest-InstanceActionsNegativeTestJSON-771215476-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 720.821842] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-9f5d36c2-446c-4a98-b988-a0a34edf8da0 tempest-InstanceActionsNegativeTestJSON-771215476 tempest-InstanceActionsNegativeTestJSON-771215476-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 720.821842] nova-conductor[52626]: Traceback (most recent call last): [ 720.821842] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 720.821842] nova-conductor[52626]: return func(*args, **kwargs) [ 720.821842] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 720.821842] nova-conductor[52626]: selections = self._select_destinations( [ 720.821842] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 720.821842] nova-conductor[52626]: selections = self._schedule( [ 720.821842] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 720.821842] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 720.821842] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 720.821842] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 720.821842] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 720.821842] nova-conductor[52626]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 720.822826] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-9f5d36c2-446c-4a98-b988-a0a34edf8da0 tempest-InstanceActionsNegativeTestJSON-771215476 tempest-InstanceActionsNegativeTestJSON-771215476-project-member] [instance: 7fa2dad0-ce34-41af-9c65-d084a503be37] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 729.723723] nova-conductor[52625]: ERROR nova.conductor.manager [None req-8862a932-8261-41c3-be79-a01a074693e1 tempest-AttachVolumeTestJSON-243494140 tempest-AttachVolumeTestJSON-243494140-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 729.723723] nova-conductor[52625]: Traceback (most recent call last): [ 729.723723] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 729.723723] nova-conductor[52625]: return func(*args, **kwargs) [ 729.723723] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 729.723723] nova-conductor[52625]: selections = self._select_destinations( [ 729.723723] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 729.723723] nova-conductor[52625]: selections = self._schedule( [ 729.723723] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 729.723723] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 729.723723] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 729.723723] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 729.723723] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 729.723723] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 729.723723] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 729.723723] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 729.723723] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 729.723723] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 729.723723] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 729.723723] nova-conductor[52625]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 729.723723] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 729.723723] nova-conductor[52625]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 729.723723] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 729.723723] nova-conductor[52625]: ERROR nova.conductor.manager result = self.transport._send( [ 729.723723] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 729.723723] nova-conductor[52625]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 729.723723] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 729.723723] nova-conductor[52625]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 729.723723] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 729.723723] nova-conductor[52625]: ERROR nova.conductor.manager raise result [ 729.723723] nova-conductor[52625]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 729.723723] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 729.723723] nova-conductor[52625]: ERROR nova.conductor.manager [ 729.723723] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 729.723723] nova-conductor[52625]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 729.723723] nova-conductor[52625]: ERROR nova.conductor.manager [ 729.723723] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 729.723723] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._select_destinations( [ 729.723723] nova-conductor[52625]: ERROR nova.conductor.manager [ 729.723723] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 729.723723] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._schedule( [ 729.723723] nova-conductor[52625]: ERROR nova.conductor.manager [ 729.723723] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 729.723723] nova-conductor[52625]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 729.723723] nova-conductor[52625]: ERROR nova.conductor.manager [ 729.723723] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 729.723723] nova-conductor[52625]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 729.723723] nova-conductor[52625]: ERROR nova.conductor.manager [ 729.723723] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 729.723723] nova-conductor[52625]: ERROR nova.conductor.manager [ 729.723723] nova-conductor[52625]: ERROR nova.conductor.manager [ 729.730710] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-8862a932-8261-41c3-be79-a01a074693e1 tempest-AttachVolumeTestJSON-243494140 tempest-AttachVolumeTestJSON-243494140-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 729.730950] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-8862a932-8261-41c3-be79-a01a074693e1 tempest-AttachVolumeTestJSON-243494140 tempest-AttachVolumeTestJSON-243494140-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 729.731152] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-8862a932-8261-41c3-be79-a01a074693e1 tempest-AttachVolumeTestJSON-243494140 tempest-AttachVolumeTestJSON-243494140-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 729.773148] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-8862a932-8261-41c3-be79-a01a074693e1 tempest-AttachVolumeTestJSON-243494140 tempest-AttachVolumeTestJSON-243494140-project-member] [instance: 33c907fa-7fb6-48b6-8601-099ffcbbec28] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 729.773867] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-8862a932-8261-41c3-be79-a01a074693e1 tempest-AttachVolumeTestJSON-243494140 tempest-AttachVolumeTestJSON-243494140-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 729.774110] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-8862a932-8261-41c3-be79-a01a074693e1 tempest-AttachVolumeTestJSON-243494140 tempest-AttachVolumeTestJSON-243494140-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 729.774296] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-8862a932-8261-41c3-be79-a01a074693e1 tempest-AttachVolumeTestJSON-243494140 tempest-AttachVolumeTestJSON-243494140-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 729.779012] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-8862a932-8261-41c3-be79-a01a074693e1 tempest-AttachVolumeTestJSON-243494140 tempest-AttachVolumeTestJSON-243494140-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 729.779012] nova-conductor[52625]: Traceback (most recent call last): [ 729.779012] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 729.779012] nova-conductor[52625]: return func(*args, **kwargs) [ 729.779012] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 729.779012] nova-conductor[52625]: selections = self._select_destinations( [ 729.779012] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 729.779012] nova-conductor[52625]: selections = self._schedule( [ 729.779012] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 729.779012] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 729.779012] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 729.779012] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 729.779012] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 729.779012] nova-conductor[52625]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 729.779544] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-8862a932-8261-41c3-be79-a01a074693e1 tempest-AttachVolumeTestJSON-243494140 tempest-AttachVolumeTestJSON-243494140-project-member] [instance: 33c907fa-7fb6-48b6-8601-099ffcbbec28] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 733.174441] nova-conductor[52626]: ERROR nova.conductor.manager [None req-f1a3864b-d7b2-4b37-93b2-23d6a113da09 tempest-AttachVolumeTestJSON-243494140 tempest-AttachVolumeTestJSON-243494140-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 733.174441] nova-conductor[52626]: Traceback (most recent call last): [ 733.174441] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 733.174441] nova-conductor[52626]: return func(*args, **kwargs) [ 733.174441] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 733.174441] nova-conductor[52626]: selections = self._select_destinations( [ 733.174441] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 733.174441] nova-conductor[52626]: selections = self._schedule( [ 733.174441] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 733.174441] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 733.174441] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 733.174441] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 733.174441] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 733.174441] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 733.174441] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 733.174441] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 733.174441] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 733.174441] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 733.174441] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 733.174441] nova-conductor[52626]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 733.174441] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 733.174441] nova-conductor[52626]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 733.174441] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 733.174441] nova-conductor[52626]: ERROR nova.conductor.manager result = self.transport._send( [ 733.174441] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 733.174441] nova-conductor[52626]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 733.174441] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 733.174441] nova-conductor[52626]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 733.174441] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 733.174441] nova-conductor[52626]: ERROR nova.conductor.manager raise result [ 733.174441] nova-conductor[52626]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 733.174441] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 733.174441] nova-conductor[52626]: ERROR nova.conductor.manager [ 733.174441] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 733.174441] nova-conductor[52626]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 733.174441] nova-conductor[52626]: ERROR nova.conductor.manager [ 733.174441] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 733.174441] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._select_destinations( [ 733.174441] nova-conductor[52626]: ERROR nova.conductor.manager [ 733.174441] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 733.174441] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._schedule( [ 733.174441] nova-conductor[52626]: ERROR nova.conductor.manager [ 733.174441] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 733.174441] nova-conductor[52626]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 733.174441] nova-conductor[52626]: ERROR nova.conductor.manager [ 733.174441] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 733.174441] nova-conductor[52626]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 733.174441] nova-conductor[52626]: ERROR nova.conductor.manager [ 733.174441] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 733.174441] nova-conductor[52626]: ERROR nova.conductor.manager [ 733.174441] nova-conductor[52626]: ERROR nova.conductor.manager [ 733.183236] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-f1a3864b-d7b2-4b37-93b2-23d6a113da09 tempest-AttachVolumeTestJSON-243494140 tempest-AttachVolumeTestJSON-243494140-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 733.183476] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-f1a3864b-d7b2-4b37-93b2-23d6a113da09 tempest-AttachVolumeTestJSON-243494140 tempest-AttachVolumeTestJSON-243494140-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 733.183651] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-f1a3864b-d7b2-4b37-93b2-23d6a113da09 tempest-AttachVolumeTestJSON-243494140 tempest-AttachVolumeTestJSON-243494140-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 733.226554] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-f1a3864b-d7b2-4b37-93b2-23d6a113da09 tempest-AttachVolumeTestJSON-243494140 tempest-AttachVolumeTestJSON-243494140-project-member] [instance: 47496b49-dff5-4422-b77a-6fb61a1022ef] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 733.227280] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-f1a3864b-d7b2-4b37-93b2-23d6a113da09 tempest-AttachVolumeTestJSON-243494140 tempest-AttachVolumeTestJSON-243494140-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 733.227485] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-f1a3864b-d7b2-4b37-93b2-23d6a113da09 tempest-AttachVolumeTestJSON-243494140 tempest-AttachVolumeTestJSON-243494140-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 733.227662] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-f1a3864b-d7b2-4b37-93b2-23d6a113da09 tempest-AttachVolumeTestJSON-243494140 tempest-AttachVolumeTestJSON-243494140-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 733.230509] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-f1a3864b-d7b2-4b37-93b2-23d6a113da09 tempest-AttachVolumeTestJSON-243494140 tempest-AttachVolumeTestJSON-243494140-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 733.230509] nova-conductor[52626]: Traceback (most recent call last): [ 733.230509] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 733.230509] nova-conductor[52626]: return func(*args, **kwargs) [ 733.230509] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 733.230509] nova-conductor[52626]: selections = self._select_destinations( [ 733.230509] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 733.230509] nova-conductor[52626]: selections = self._schedule( [ 733.230509] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 733.230509] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 733.230509] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 733.230509] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 733.230509] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 733.230509] nova-conductor[52626]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 733.231011] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-f1a3864b-d7b2-4b37-93b2-23d6a113da09 tempest-AttachVolumeTestJSON-243494140 tempest-AttachVolumeTestJSON-243494140-project-member] [instance: 47496b49-dff5-4422-b77a-6fb61a1022ef] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 753.056362] nova-conductor[52626]: ERROR nova.scheduler.utils [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance b5636e10-af08-49d3-a9b2-8122521a9e2c was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 753.057313] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Rescheduling: True {{(pid=52626) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 753.057313] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance b5636e10-af08-49d3-a9b2-8122521a9e2c.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. 
Exhausted all hosts available for retrying build failures for instance b5636e10-af08-49d3-a9b2-8122521a9e2c. [ 753.057461] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance b5636e10-af08-49d3-a9b2-8122521a9e2c. [ 753.087582] nova-conductor[52626]: DEBUG nova.network.neutron [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] deallocate_for_instance() {{(pid=52626) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 753.105714] nova-conductor[52626]: DEBUG nova.network.neutron [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Instance cache missing network info. {{(pid=52626) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 753.108990] nova-conductor[52626]: DEBUG nova.network.neutron [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Updating instance_info_cache with network_info: [] {{(pid=52626) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 801.351145] nova-conductor[52625]: ERROR nova.scheduler.utils [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 
75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance 72caf1e5-e894-4581-a95d-21dda85e11b0 was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 801.351816] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Rescheduling: True {{(pid=52625) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 801.353161] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 72caf1e5-e894-4581-a95d-21dda85e11b0.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 72caf1e5-e894-4581-a95d-21dda85e11b0. [ 801.353161] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 72caf1e5-e894-4581-a95d-21dda85e11b0. [ 801.383049] nova-conductor[52625]: DEBUG nova.network.neutron [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] deallocate_for_instance() {{(pid=52625) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 801.403475] nova-conductor[52625]: DEBUG nova.network.neutron [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Instance cache missing network info. 
{{(pid=52625) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 801.406967] nova-conductor[52625]: DEBUG nova.network.neutron [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Updating instance_info_cache with network_info: [] {{(pid=52625) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 808.740767] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Took 0.13 seconds to select destinations for 1 instance(s). {{(pid=52626) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 808.757240] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 808.757485] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 808.757662] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 808.794511] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 808.794511] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 808.794511] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 808.794899] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 808.795094] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 808.795261] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 808.806727] nova-conductor[52626]: DEBUG nova.quota [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Getting quotas for project e15efea8f51244ebab9a72c9fcd83456. Resources: {'cores', 'instances', 'ram'} {{(pid=52626) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 808.810538] nova-conductor[52626]: DEBUG nova.quota [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Getting quotas for user 6b2c42f3195e46788132fb07d6b771ae and project e15efea8f51244ebab9a72c9fcd83456. 
Resources: {'cores', 'instances', 'ram'} {{(pid=52626) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 808.816150] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52626) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 808.816601] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 808.816806] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 808.817188] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 808.821315] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 808.821315] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 808.821315] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 808.821315] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 808.833317] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 808.833535] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 808.833712] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 826.024123] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Targeting cell e3885d83-7df6-4250-a13b-6a1c0495dd3b(cell1) for conductor method rebuild_instance {{(pid=52625) wrapper /opt/stack/nova/nova/conductor/manager.py:95}} [ 826.024379] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 826.024606] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 826.024727] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 826.038139] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] No migration record for the rebuild/evacuate request. {{(pid=52625) rebuild_instance /opt/stack/nova/nova/conductor/manager.py:1227}} [ 836.891474] nova-conductor[52625]: ERROR nova.conductor.manager [None req-a583adcb-f72d-466c-b3af-363e192b0d9d tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 836.891474] nova-conductor[52625]: Traceback (most recent call last): [ 836.891474] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 836.891474] nova-conductor[52625]: return func(*args, **kwargs) [ 836.891474] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 836.891474] nova-conductor[52625]: selections = self._select_destinations( [ 836.891474] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 836.891474] nova-conductor[52625]: selections = self._schedule( [ 836.891474] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 836.891474] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 836.891474] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 836.891474] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 836.891474] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 836.891474] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 836.891474] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 836.891474] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 836.891474] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 836.891474] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 836.891474] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 836.891474] nova-conductor[52625]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 836.891474] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 836.891474] nova-conductor[52625]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 836.891474] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 836.891474] nova-conductor[52625]: ERROR nova.conductor.manager result = self.transport._send( [ 836.891474] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 836.891474] nova-conductor[52625]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 836.891474] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 836.891474] nova-conductor[52625]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 836.891474] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 836.891474] nova-conductor[52625]: ERROR nova.conductor.manager raise result [ 836.891474] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 836.891474] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 836.891474] nova-conductor[52625]: ERROR nova.conductor.manager [ 836.891474] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 836.891474] nova-conductor[52625]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 836.891474] nova-conductor[52625]: ERROR nova.conductor.manager [ 836.891474] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 836.891474] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._select_destinations( [ 836.891474] nova-conductor[52625]: ERROR nova.conductor.manager [ 836.891474] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 836.891474] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._schedule( [ 836.891474] nova-conductor[52625]: ERROR nova.conductor.manager [ 836.891474] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 836.891474] nova-conductor[52625]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 836.891474] nova-conductor[52625]: ERROR nova.conductor.manager [ 836.891474] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 836.891474] nova-conductor[52625]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 836.891474] nova-conductor[52625]: ERROR nova.conductor.manager [ 836.891474] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 836.891474] nova-conductor[52625]: ERROR nova.conductor.manager [ 836.891474] nova-conductor[52625]: ERROR nova.conductor.manager [ 836.898593] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-a583adcb-f72d-466c-b3af-363e192b0d9d tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 836.898860] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-a583adcb-f72d-466c-b3af-363e192b0d9d tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 836.899074] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-a583adcb-f72d-466c-b3af-363e192b0d9d tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 836.964796] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-a583adcb-f72d-466c-b3af-363e192b0d9d tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] [instance: c96e9b8d-ccc8-41c5-a2c4-67dff6ed4425] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 836.965577] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-a583adcb-f72d-466c-b3af-363e192b0d9d tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 836.965892] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-a583adcb-f72d-466c-b3af-363e192b0d9d tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 836.966051] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-a583adcb-f72d-466c-b3af-363e192b0d9d tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 836.971475] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-a583adcb-f72d-466c-b3af-363e192b0d9d tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 836.971475] nova-conductor[52625]: Traceback (most recent call last): [ 836.971475] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 836.971475] nova-conductor[52625]: return func(*args, **kwargs) [ 836.971475] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 836.971475] nova-conductor[52625]: selections = self._select_destinations( [ 836.971475] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 836.971475] nova-conductor[52625]: selections = self._schedule( [ 836.971475] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 836.971475] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 836.971475] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 836.971475] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 836.971475] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 836.971475] nova-conductor[52625]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 836.971475] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-a583adcb-f72d-466c-b3af-363e192b0d9d tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] [instance: c96e9b8d-ccc8-41c5-a2c4-67dff6ed4425] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 838.286120] nova-conductor[52626]: ERROR nova.conductor.manager [None req-54ebd78b-b9d0-4e00-94b2-8306eecb60d3 tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 838.286120] nova-conductor[52626]: Traceback (most recent call last): [ 838.286120] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 838.286120] nova-conductor[52626]: return func(*args, **kwargs) [ 838.286120] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 838.286120] nova-conductor[52626]: selections = self._select_destinations( [ 838.286120] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 838.286120] nova-conductor[52626]: selections = self._schedule( [ 838.286120] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 838.286120] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 838.286120] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 838.286120] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 838.286120] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 838.286120] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 838.286120] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 838.286120] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 838.286120] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 838.286120] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 838.286120] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 838.286120] nova-conductor[52626]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 838.286120] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 838.286120] nova-conductor[52626]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 838.286120] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 838.286120] nova-conductor[52626]: ERROR nova.conductor.manager result = self.transport._send( [ 838.286120] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 838.286120] nova-conductor[52626]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 838.286120] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 838.286120] nova-conductor[52626]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 838.286120] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 838.286120] nova-conductor[52626]: ERROR nova.conductor.manager raise result [ 838.286120] nova-conductor[52626]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 838.286120] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 838.286120] nova-conductor[52626]: ERROR nova.conductor.manager [ 838.286120] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 838.286120] nova-conductor[52626]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 838.286120] nova-conductor[52626]: ERROR nova.conductor.manager [ 838.286120] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 838.286120] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._select_destinations( [ 838.286120] nova-conductor[52626]: ERROR nova.conductor.manager [ 838.286120] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 838.286120] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._schedule( [ 838.286120] nova-conductor[52626]: ERROR nova.conductor.manager [ 838.286120] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 838.286120] nova-conductor[52626]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 838.286120] nova-conductor[52626]: ERROR nova.conductor.manager [ 838.286120] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 838.286120] nova-conductor[52626]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 838.286120] nova-conductor[52626]: ERROR nova.conductor.manager [ 838.286120] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 838.286120] nova-conductor[52626]: ERROR nova.conductor.manager [ 838.286120] nova-conductor[52626]: ERROR nova.conductor.manager [ 838.292895] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-54ebd78b-b9d0-4e00-94b2-8306eecb60d3 tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 838.293223] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-54ebd78b-b9d0-4e00-94b2-8306eecb60d3 tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 838.293437] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-54ebd78b-b9d0-4e00-94b2-8306eecb60d3 tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 838.341657] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-54ebd78b-b9d0-4e00-94b2-8306eecb60d3 tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: fa09988a-4f54-468b-8663-c46b06caa857] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 838.342437] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-54ebd78b-b9d0-4e00-94b2-8306eecb60d3 tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 838.342437] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-54ebd78b-b9d0-4e00-94b2-8306eecb60d3 tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 838.342437] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-54ebd78b-b9d0-4e00-94b2-8306eecb60d3 tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) 
inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 838.345315] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-54ebd78b-b9d0-4e00-94b2-8306eecb60d3 tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 838.345315] nova-conductor[52626]: Traceback (most recent call last): [ 838.345315] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 838.345315] nova-conductor[52626]: return func(*args, **kwargs) [ 838.345315] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 838.345315] nova-conductor[52626]: selections = self._select_destinations( [ 838.345315] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 838.345315] nova-conductor[52626]: selections = self._schedule( [ 838.345315] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 838.345315] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 838.345315] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 838.345315] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 838.345315] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 838.345315] nova-conductor[52626]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 838.345974] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-54ebd78b-b9d0-4e00-94b2-8306eecb60d3 tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: fa09988a-4f54-468b-8663-c46b06caa857] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 838.999892] nova-conductor[52625]: ERROR nova.conductor.manager [None req-9048f370-cc16-4014-8804-e18f1db126fb tempest-ServerTagsTestJSON-545755618 tempest-ServerTagsTestJSON-545755618-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 838.999892] nova-conductor[52625]: Traceback (most recent call last): [ 838.999892] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 838.999892] nova-conductor[52625]: return func(*args, **kwargs) [ 838.999892] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 838.999892] nova-conductor[52625]: selections = self._select_destinations( [ 838.999892] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 838.999892] nova-conductor[52625]: selections = self._schedule( [ 838.999892] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 838.999892] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 838.999892] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 838.999892] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 838.999892] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. 
There are not enough hosts available. [ 838.999892] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 838.999892] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 838.999892] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 838.999892] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 838.999892] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 838.999892] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 838.999892] nova-conductor[52625]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 838.999892] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 838.999892] nova-conductor[52625]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 838.999892] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 838.999892] nova-conductor[52625]: ERROR nova.conductor.manager result = self.transport._send( [ 838.999892] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 838.999892] nova-conductor[52625]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 838.999892] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 838.999892] nova-conductor[52625]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 838.999892] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 838.999892] nova-conductor[52625]: ERROR nova.conductor.manager raise result [ 838.999892] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 838.999892] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 838.999892] nova-conductor[52625]: ERROR nova.conductor.manager [ 838.999892] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 838.999892] nova-conductor[52625]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 838.999892] nova-conductor[52625]: ERROR nova.conductor.manager [ 838.999892] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 838.999892] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._select_destinations( [ 838.999892] nova-conductor[52625]: ERROR nova.conductor.manager [ 838.999892] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 838.999892] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._schedule( [ 838.999892] nova-conductor[52625]: ERROR nova.conductor.manager [ 838.999892] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 838.999892] nova-conductor[52625]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 838.999892] nova-conductor[52625]: ERROR nova.conductor.manager [ 838.999892] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 838.999892] nova-conductor[52625]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 838.999892] nova-conductor[52625]: ERROR nova.conductor.manager [ 838.999892] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 838.999892] nova-conductor[52625]: ERROR nova.conductor.manager [ 838.999892] nova-conductor[52625]: ERROR nova.conductor.manager [ 839.007887] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-9048f370-cc16-4014-8804-e18f1db126fb tempest-ServerTagsTestJSON-545755618 tempest-ServerTagsTestJSON-545755618-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 839.008153] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-9048f370-cc16-4014-8804-e18f1db126fb tempest-ServerTagsTestJSON-545755618 tempest-ServerTagsTestJSON-545755618-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 839.008337] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-9048f370-cc16-4014-8804-e18f1db126fb tempest-ServerTagsTestJSON-545755618 tempest-ServerTagsTestJSON-545755618-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 839.058241] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-9048f370-cc16-4014-8804-e18f1db126fb tempest-ServerTagsTestJSON-545755618 tempest-ServerTagsTestJSON-545755618-project-member] [instance: c1b2d641-b069-4e84-9b20-f0c550d1c9b0] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 839.059069] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-9048f370-cc16-4014-8804-e18f1db126fb tempest-ServerTagsTestJSON-545755618 tempest-ServerTagsTestJSON-545755618-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 839.059353] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-9048f370-cc16-4014-8804-e18f1db126fb tempest-ServerTagsTestJSON-545755618 tempest-ServerTagsTestJSON-545755618-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 839.059575] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-9048f370-cc16-4014-8804-e18f1db126fb tempest-ServerTagsTestJSON-545755618 tempest-ServerTagsTestJSON-545755618-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 839.063121] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-9048f370-cc16-4014-8804-e18f1db126fb tempest-ServerTagsTestJSON-545755618 tempest-ServerTagsTestJSON-545755618-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 839.063121] nova-conductor[52625]: Traceback (most recent call last): [ 839.063121] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 839.063121] nova-conductor[52625]: return func(*args, **kwargs) [ 839.063121] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 839.063121] nova-conductor[52625]: selections = self._select_destinations( [ 839.063121] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 839.063121] nova-conductor[52625]: selections = self._schedule( [ 839.063121] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 839.063121] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 839.063121] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 839.063121] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 839.063121] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 839.063121] nova-conductor[52625]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 839.063927] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-9048f370-cc16-4014-8804-e18f1db126fb tempest-ServerTagsTestJSON-545755618 tempest-ServerTagsTestJSON-545755618-project-member] [instance: c1b2d641-b069-4e84-9b20-f0c550d1c9b0] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 839.989250] nova-conductor[52626]: ERROR nova.conductor.manager [None req-96845e12-cada-4ac2-978b-672fca2e1ed7 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 839.989250] nova-conductor[52626]: Traceback (most recent call last): [ 839.989250] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 839.989250] nova-conductor[52626]: return func(*args, **kwargs) [ 839.989250] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 839.989250] nova-conductor[52626]: selections = self._select_destinations( [ 839.989250] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 839.989250] nova-conductor[52626]: selections = self._schedule( [ 839.989250] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 839.989250] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 839.989250] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 839.989250] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 839.989250] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 839.989250] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 839.989250] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 839.989250] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 839.989250] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 839.989250] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 839.989250] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 839.989250] nova-conductor[52626]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 839.989250] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 839.989250] nova-conductor[52626]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 839.989250] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 839.989250] nova-conductor[52626]: ERROR nova.conductor.manager result = self.transport._send( [ 839.989250] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 839.989250] nova-conductor[52626]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 839.989250] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 839.989250] nova-conductor[52626]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 839.989250] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 839.989250] nova-conductor[52626]: ERROR nova.conductor.manager raise result [ 839.989250] nova-conductor[52626]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 839.989250] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 839.989250] nova-conductor[52626]: ERROR nova.conductor.manager [ 839.989250] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 839.989250] nova-conductor[52626]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 839.989250] nova-conductor[52626]: ERROR nova.conductor.manager [ 839.989250] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 839.989250] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._select_destinations( [ 839.989250] nova-conductor[52626]: ERROR nova.conductor.manager [ 839.989250] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 839.989250] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._schedule( [ 839.989250] nova-conductor[52626]: ERROR nova.conductor.manager [ 839.989250] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 839.989250] nova-conductor[52626]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 839.989250] nova-conductor[52626]: ERROR nova.conductor.manager [ 839.989250] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 839.989250] nova-conductor[52626]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 839.989250] nova-conductor[52626]: ERROR nova.conductor.manager [ 839.989250] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 839.989250] nova-conductor[52626]: ERROR nova.conductor.manager [ 839.989250] nova-conductor[52626]: ERROR nova.conductor.manager [ 840.001013] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-96845e12-cada-4ac2-978b-672fca2e1ed7 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 840.001013] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-96845e12-cada-4ac2-978b-672fca2e1ed7 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 840.001013] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-96845e12-cada-4ac2-978b-672fca2e1ed7 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 840.061199] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-96845e12-cada-4ac2-978b-672fca2e1ed7 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: 44ac4b5d-a12d-4163-82ef-91bfd4955a6a] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 840.061483] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-96845e12-cada-4ac2-978b-672fca2e1ed7 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 840.061564] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-96845e12-cada-4ac2-978b-672fca2e1ed7 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 840.061744] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-96845e12-cada-4ac2-978b-672fca2e1ed7 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 840.069454] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-96845e12-cada-4ac2-978b-672fca2e1ed7 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 840.069454] nova-conductor[52626]: Traceback (most recent call last): [ 840.069454] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 840.069454] nova-conductor[52626]: return func(*args, **kwargs) [ 840.069454] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 840.069454] nova-conductor[52626]: selections = self._select_destinations( [ 840.069454] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 840.069454] nova-conductor[52626]: selections = self._schedule( [ 840.069454] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 840.069454] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 840.069454] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 840.069454] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 840.069454] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 840.069454] nova-conductor[52626]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 840.070067] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-96845e12-cada-4ac2-978b-672fca2e1ed7 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: 44ac4b5d-a12d-4163-82ef-91bfd4955a6a] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 840.712052] nova-conductor[52625]: ERROR nova.conductor.manager [None req-02097024-71dc-4bc9-a75e-d35bc42fa2b1 tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 840.712052] nova-conductor[52625]: Traceback (most recent call last): [ 840.712052] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 840.712052] nova-conductor[52625]: return func(*args, **kwargs) [ 840.712052] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 840.712052] nova-conductor[52625]: selections = self._select_destinations( [ 840.712052] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 840.712052] nova-conductor[52625]: selections = self._schedule( [ 840.712052] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 840.712052] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 840.712052] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 840.712052] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 840.712052] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 840.712052] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 840.712052] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 840.712052] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 840.712052] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 840.712052] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 840.712052] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 840.712052] nova-conductor[52625]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 840.712052] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 840.712052] nova-conductor[52625]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 840.712052] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 840.712052] nova-conductor[52625]: ERROR nova.conductor.manager result = self.transport._send( [ 840.712052] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 840.712052] nova-conductor[52625]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 840.712052] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 840.712052] nova-conductor[52625]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 840.712052] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 840.712052] nova-conductor[52625]: ERROR nova.conductor.manager raise result [ 840.712052] nova-conductor[52625]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 840.712052] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 840.712052] nova-conductor[52625]: ERROR nova.conductor.manager [ 840.712052] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 840.712052] nova-conductor[52625]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 840.712052] nova-conductor[52625]: ERROR nova.conductor.manager [ 840.712052] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 840.712052] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._select_destinations( [ 840.712052] nova-conductor[52625]: ERROR nova.conductor.manager [ 840.712052] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 840.712052] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._schedule( [ 840.712052] nova-conductor[52625]: ERROR nova.conductor.manager [ 840.712052] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 840.712052] nova-conductor[52625]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 840.712052] nova-conductor[52625]: ERROR nova.conductor.manager [ 840.712052] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 840.712052] nova-conductor[52625]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 840.712052] nova-conductor[52625]: ERROR nova.conductor.manager [ 840.712052] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 840.712052] nova-conductor[52625]: ERROR nova.conductor.manager [ 840.712052] nova-conductor[52625]: ERROR nova.conductor.manager [ 840.723814] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-02097024-71dc-4bc9-a75e-d35bc42fa2b1 tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 840.724069] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-02097024-71dc-4bc9-a75e-d35bc42fa2b1 tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 840.724260] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-02097024-71dc-4bc9-a75e-d35bc42fa2b1 tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 840.779831] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-02097024-71dc-4bc9-a75e-d35bc42fa2b1 tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 1d1a429a-cdb9-4185-b658-2101bab646f2] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 840.781566] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-02097024-71dc-4bc9-a75e-d35bc42fa2b1 tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 840.781566] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-02097024-71dc-4bc9-a75e-d35bc42fa2b1 tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 840.781566] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-02097024-71dc-4bc9-a75e-d35bc42fa2b1 tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) 
inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 840.784236] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-02097024-71dc-4bc9-a75e-d35bc42fa2b1 tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 840.784236] nova-conductor[52625]: Traceback (most recent call last): [ 840.784236] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 840.784236] nova-conductor[52625]: return func(*args, **kwargs) [ 840.784236] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 840.784236] nova-conductor[52625]: selections = self._select_destinations( [ 840.784236] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 840.784236] nova-conductor[52625]: selections = self._schedule( [ 840.784236] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 840.784236] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 840.784236] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 840.784236] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 840.784236] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 840.784236] nova-conductor[52625]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 840.785553] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-02097024-71dc-4bc9-a75e-d35bc42fa2b1 tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 1d1a429a-cdb9-4185-b658-2101bab646f2] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 842.428539] nova-conductor[52626]: ERROR nova.conductor.manager [None req-14720d91-4d21-421c-a07f-f52744e1793c tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 842.428539] nova-conductor[52626]: Traceback (most recent call last): [ 842.428539] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 842.428539] nova-conductor[52626]: return func(*args, **kwargs) [ 842.428539] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 842.428539] nova-conductor[52626]: selections = self._select_destinations( [ 842.428539] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 842.428539] nova-conductor[52626]: selections = self._schedule( [ 842.428539] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 842.428539] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 842.428539] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 842.428539] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 842.428539] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. 
There are not enough hosts available. [ 842.428539] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 842.428539] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 842.428539] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 842.428539] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 842.428539] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 842.428539] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 842.428539] nova-conductor[52626]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 842.428539] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 842.428539] nova-conductor[52626]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 842.428539] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 842.428539] nova-conductor[52626]: ERROR nova.conductor.manager result = self.transport._send( [ 842.428539] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 842.428539] nova-conductor[52626]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 842.428539] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 842.428539] nova-conductor[52626]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 842.428539] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 842.428539] nova-conductor[52626]: ERROR nova.conductor.manager raise result [ 842.428539] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 842.428539] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 842.428539] nova-conductor[52626]: ERROR nova.conductor.manager [ 842.428539] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 842.428539] nova-conductor[52626]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 842.428539] nova-conductor[52626]: ERROR nova.conductor.manager [ 842.428539] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 842.428539] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._select_destinations( [ 842.428539] nova-conductor[52626]: ERROR nova.conductor.manager [ 842.428539] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 842.428539] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._schedule( [ 842.428539] nova-conductor[52626]: ERROR nova.conductor.manager [ 842.428539] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 842.428539] nova-conductor[52626]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 842.428539] nova-conductor[52626]: ERROR nova.conductor.manager [ 842.428539] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 842.428539] nova-conductor[52626]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 842.428539] nova-conductor[52626]: ERROR nova.conductor.manager [ 842.428539] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 842.428539] nova-conductor[52626]: ERROR nova.conductor.manager [ 842.428539] nova-conductor[52626]: ERROR nova.conductor.manager [ 842.433702] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-14720d91-4d21-421c-a07f-f52744e1793c tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 842.433942] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-14720d91-4d21-421c-a07f-f52744e1793c tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 842.434133] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-14720d91-4d21-421c-a07f-f52744e1793c tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 842.483372] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-14720d91-4d21-421c-a07f-f52744e1793c tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: a79aa5a5-cda6-4eec-9d9a-ecef87384ff6] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 842.487923] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-14720d91-4d21-421c-a07f-f52744e1793c tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 842.487923] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-14720d91-4d21-421c-a07f-f52744e1793c tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 842.487923] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-14720d91-4d21-421c-a07f-f52744e1793c tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 842.489861] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-14720d91-4d21-421c-a07f-f52744e1793c tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 842.489861] nova-conductor[52626]: Traceback (most recent call last): [ 842.489861] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 842.489861] nova-conductor[52626]: return func(*args, **kwargs) [ 842.489861] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 842.489861] nova-conductor[52626]: selections = self._select_destinations( [ 842.489861] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 842.489861] nova-conductor[52626]: selections = self._schedule( [ 842.489861] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 842.489861] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 842.489861] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 842.489861] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 842.489861] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 842.489861] nova-conductor[52626]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 842.490407] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-14720d91-4d21-421c-a07f-f52744e1793c tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: a79aa5a5-cda6-4eec-9d9a-ecef87384ff6] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 843.363902] nova-conductor[52625]: ERROR nova.conductor.manager [None req-75264d09-353b-4d8e-aefa-00f2362117eb tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 843.363902] nova-conductor[52625]: Traceback (most recent call last): [ 843.363902] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 843.363902] nova-conductor[52625]: return func(*args, **kwargs) [ 843.363902] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 843.363902] nova-conductor[52625]: selections = self._select_destinations( [ 843.363902] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 843.363902] nova-conductor[52625]: selections = self._schedule( [ 843.363902] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 843.363902] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 843.363902] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 843.363902] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 843.363902] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 843.363902] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 843.363902] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 843.363902] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 843.363902] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 843.363902] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 843.363902] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 843.363902] nova-conductor[52625]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 843.363902] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 843.363902] nova-conductor[52625]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 843.363902] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 843.363902] nova-conductor[52625]: ERROR nova.conductor.manager result = self.transport._send( [ 843.363902] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 843.363902] nova-conductor[52625]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 843.363902] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 843.363902] nova-conductor[52625]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 843.363902] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 843.363902] nova-conductor[52625]: ERROR nova.conductor.manager raise result [ 843.363902] nova-conductor[52625]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 843.363902] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 843.363902] nova-conductor[52625]: ERROR nova.conductor.manager [ 843.363902] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 843.363902] nova-conductor[52625]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 843.363902] nova-conductor[52625]: ERROR nova.conductor.manager [ 843.363902] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 843.363902] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._select_destinations( [ 843.363902] nova-conductor[52625]: ERROR nova.conductor.manager [ 843.363902] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 843.363902] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._schedule( [ 843.363902] nova-conductor[52625]: ERROR nova.conductor.manager [ 843.363902] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 843.363902] nova-conductor[52625]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 843.363902] nova-conductor[52625]: ERROR nova.conductor.manager [ 843.363902] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 843.363902] nova-conductor[52625]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 843.363902] nova-conductor[52625]: ERROR nova.conductor.manager [ 843.363902] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 843.363902] nova-conductor[52625]: ERROR nova.conductor.manager [ 843.363902] nova-conductor[52625]: ERROR nova.conductor.manager [ 843.370268] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-75264d09-353b-4d8e-aefa-00f2362117eb tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 843.370490] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-75264d09-353b-4d8e-aefa-00f2362117eb tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 843.370709] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-75264d09-353b-4d8e-aefa-00f2362117eb tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 843.412449] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-75264d09-353b-4d8e-aefa-00f2362117eb tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 615b7b89-daa9-49fc-bebb-fbb37e834ac5] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 843.412449] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-75264d09-353b-4d8e-aefa-00f2362117eb tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 843.412613] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-75264d09-353b-4d8e-aefa-00f2362117eb tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 843.412710] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-75264d09-353b-4d8e-aefa-00f2362117eb tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) 
inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 843.415453] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-75264d09-353b-4d8e-aefa-00f2362117eb tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 843.415453] nova-conductor[52625]: Traceback (most recent call last): [ 843.415453] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 843.415453] nova-conductor[52625]: return func(*args, **kwargs) [ 843.415453] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 843.415453] nova-conductor[52625]: selections = self._select_destinations( [ 843.415453] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 843.415453] nova-conductor[52625]: selections = self._schedule( [ 843.415453] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 843.415453] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 843.415453] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 843.415453] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 843.415453] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 843.415453] nova-conductor[52625]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 843.416069] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-75264d09-353b-4d8e-aefa-00f2362117eb tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 615b7b89-daa9-49fc-bebb-fbb37e834ac5] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 844.392446] nova-conductor[52626]: ERROR nova.conductor.manager [None req-08c8cad3-081a-404c-b73e-fd1b9e12daf8 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 844.392446] nova-conductor[52626]: Traceback (most recent call last): [ 844.392446] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 844.392446] nova-conductor[52626]: return func(*args, **kwargs) [ 844.392446] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 844.392446] nova-conductor[52626]: selections = self._select_destinations( [ 844.392446] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 844.392446] nova-conductor[52626]: selections = self._schedule( [ 844.392446] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 844.392446] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 844.392446] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 844.392446] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 844.392446] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. 
There are not enough hosts available. [ 844.392446] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 844.392446] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 844.392446] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 844.392446] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 844.392446] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 844.392446] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 844.392446] nova-conductor[52626]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 844.392446] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 844.392446] nova-conductor[52626]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 844.392446] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 844.392446] nova-conductor[52626]: ERROR nova.conductor.manager result = self.transport._send( [ 844.392446] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 844.392446] nova-conductor[52626]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 844.392446] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 844.392446] nova-conductor[52626]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 844.392446] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 844.392446] nova-conductor[52626]: ERROR nova.conductor.manager raise result [ 844.392446] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 844.392446] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 844.392446] nova-conductor[52626]: ERROR nova.conductor.manager [ 844.392446] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 844.392446] nova-conductor[52626]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 844.392446] nova-conductor[52626]: ERROR nova.conductor.manager [ 844.392446] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 844.392446] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._select_destinations( [ 844.392446] nova-conductor[52626]: ERROR nova.conductor.manager [ 844.392446] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 844.392446] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._schedule( [ 844.392446] nova-conductor[52626]: ERROR nova.conductor.manager [ 844.392446] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 844.392446] nova-conductor[52626]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 844.392446] nova-conductor[52626]: ERROR nova.conductor.manager [ 844.392446] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 844.392446] nova-conductor[52626]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 844.392446] nova-conductor[52626]: ERROR nova.conductor.manager [ 844.392446] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 844.392446] nova-conductor[52626]: ERROR nova.conductor.manager [ 844.392446] nova-conductor[52626]: ERROR nova.conductor.manager [ 844.398099] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-08c8cad3-081a-404c-b73e-fd1b9e12daf8 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 844.398328] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-08c8cad3-081a-404c-b73e-fd1b9e12daf8 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 844.398499] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-08c8cad3-081a-404c-b73e-fd1b9e12daf8 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 844.453450] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-08c8cad3-081a-404c-b73e-fd1b9e12daf8 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: c8ee81ee-0f43-4fa0-810d-40b876cc3e8f] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 844.454271] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-08c8cad3-081a-404c-b73e-fd1b9e12daf8 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 844.454415] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-08c8cad3-081a-404c-b73e-fd1b9e12daf8 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 844.454585] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-08c8cad3-081a-404c-b73e-fd1b9e12daf8 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 844.457718] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-08c8cad3-081a-404c-b73e-fd1b9e12daf8 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 844.457718] nova-conductor[52626]: Traceback (most recent call last): [ 844.457718] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 844.457718] nova-conductor[52626]: return func(*args, **kwargs) [ 844.457718] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 844.457718] nova-conductor[52626]: selections = self._select_destinations( [ 844.457718] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 844.457718] nova-conductor[52626]: selections = self._schedule( [ 844.457718] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 844.457718] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 844.457718] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 844.457718] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 844.457718] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 844.457718] nova-conductor[52626]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 844.458281] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-08c8cad3-081a-404c-b73e-fd1b9e12daf8 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: c8ee81ee-0f43-4fa0-810d-40b876cc3e8f] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 848.496359] nova-conductor[52626]: ERROR nova.scheduler.utils [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance 3107f9c0-9a35-424c-9fa3-d60057b9ceec was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 848.497293] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Rescheduling: True {{(pid=52626) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 848.497293] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 3107f9c0-9a35-424c-9fa3-d60057b9ceec.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. 
Exhausted all hosts available for retrying build failures for instance 3107f9c0-9a35-424c-9fa3-d60057b9ceec. [ 848.497466] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 3107f9c0-9a35-424c-9fa3-d60057b9ceec. [ 848.527905] nova-conductor[52626]: DEBUG nova.network.neutron [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] deallocate_for_instance() {{(pid=52626) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 848.547211] nova-conductor[52625]: DEBUG nova.db.main.api [None req-f7aeea98-40ba-4b13-96a0-735947cb1400 tempest-ServerMetadataNegativeTestJSON-404749069 tempest-ServerMetadataNegativeTestJSON-404749069-project-member] Created instance_extra for 6c94c59c-44ab-4cb9-8480-18e8a424993b {{(pid=52625) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 848.547768] nova-conductor[52626]: DEBUG nova.network.neutron [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Instance cache missing network info. {{(pid=52626) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 848.552377] nova-conductor[52626]: DEBUG nova.network.neutron [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Updating instance_info_cache with network_info: [] {{(pid=52626) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 855.990044] nova-conductor[52625]: ERROR nova.conductor.manager [None req-964bc578-1171-4487-a145-f8526e0db082 tempest-ServerActionsTestOtherB-905591724 tempest-ServerActionsTestOtherB-905591724-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 855.990044] nova-conductor[52625]: Traceback (most recent call last): [ 855.990044] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 855.990044] nova-conductor[52625]: return func(*args, **kwargs) [ 855.990044] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 855.990044] nova-conductor[52625]: selections = self._select_destinations( [ 855.990044] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 855.990044] nova-conductor[52625]: selections = self._schedule( [ 855.990044] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 855.990044] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 855.990044] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 855.990044] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 855.990044] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 855.990044] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 855.990044] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 855.990044] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 855.990044] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 855.990044] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 855.990044] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 855.990044] nova-conductor[52625]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 855.990044] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 855.990044] nova-conductor[52625]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 855.990044] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 855.990044] nova-conductor[52625]: ERROR nova.conductor.manager result = self.transport._send( [ 855.990044] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 855.990044] nova-conductor[52625]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 855.990044] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 855.990044] nova-conductor[52625]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 855.990044] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 855.990044] nova-conductor[52625]: ERROR nova.conductor.manager raise result [ 855.990044] nova-conductor[52625]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 855.990044] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 855.990044] nova-conductor[52625]: ERROR nova.conductor.manager [ 855.990044] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 855.990044] nova-conductor[52625]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 855.990044] nova-conductor[52625]: ERROR nova.conductor.manager [ 855.990044] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 855.990044] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._select_destinations( [ 855.990044] nova-conductor[52625]: ERROR nova.conductor.manager [ 855.990044] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 855.990044] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._schedule( [ 855.990044] nova-conductor[52625]: ERROR nova.conductor.manager [ 855.990044] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 855.990044] nova-conductor[52625]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 855.990044] nova-conductor[52625]: ERROR nova.conductor.manager [ 855.990044] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 855.990044] nova-conductor[52625]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 855.990044] nova-conductor[52625]: ERROR nova.conductor.manager [ 855.990044] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 855.990044] nova-conductor[52625]: ERROR nova.conductor.manager [ 855.990044] nova-conductor[52625]: ERROR nova.conductor.manager [ 855.999496] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-964bc578-1171-4487-a145-f8526e0db082 tempest-ServerActionsTestOtherB-905591724 tempest-ServerActionsTestOtherB-905591724-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 855.999496] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-964bc578-1171-4487-a145-f8526e0db082 tempest-ServerActionsTestOtherB-905591724 tempest-ServerActionsTestOtherB-905591724-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 855.999496] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-964bc578-1171-4487-a145-f8526e0db082 tempest-ServerActionsTestOtherB-905591724 tempest-ServerActionsTestOtherB-905591724-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 856.065582] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-964bc578-1171-4487-a145-f8526e0db082 tempest-ServerActionsTestOtherB-905591724 tempest-ServerActionsTestOtherB-905591724-project-member] [instance: bed42066-939d-4f0c-b387-9985fe66d054] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 856.065782] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-964bc578-1171-4487-a145-f8526e0db082 tempest-ServerActionsTestOtherB-905591724 tempest-ServerActionsTestOtherB-905591724-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 856.065845] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-964bc578-1171-4487-a145-f8526e0db082 tempest-ServerActionsTestOtherB-905591724 tempest-ServerActionsTestOtherB-905591724-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 856.066027] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-964bc578-1171-4487-a145-f8526e0db082 tempest-ServerActionsTestOtherB-905591724 tempest-ServerActionsTestOtherB-905591724-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 856.073758] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-964bc578-1171-4487-a145-f8526e0db082 tempest-ServerActionsTestOtherB-905591724 tempest-ServerActionsTestOtherB-905591724-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 856.073758] nova-conductor[52625]: Traceback (most recent call last): [ 856.073758] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 856.073758] nova-conductor[52625]: return func(*args, **kwargs) [ 856.073758] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 856.073758] nova-conductor[52625]: selections = self._select_destinations( [ 856.073758] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 856.073758] nova-conductor[52625]: selections = self._schedule( [ 856.073758] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 856.073758] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 856.073758] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 856.073758] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 856.073758] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 856.073758] nova-conductor[52625]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 856.074716] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-964bc578-1171-4487-a145-f8526e0db082 tempest-ServerActionsTestOtherB-905591724 tempest-ServerActionsTestOtherB-905591724-project-member] [instance: bed42066-939d-4f0c-b387-9985fe66d054] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 863.411496] nova-conductor[52626]: ERROR nova.conductor.manager [None req-5a1ad026-4d62-45bf-86d4-88562e576460 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 863.411496] nova-conductor[52626]: Traceback (most recent call last): [ 863.411496] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 863.411496] nova-conductor[52626]: return func(*args, **kwargs) [ 863.411496] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 863.411496] nova-conductor[52626]: selections = self._select_destinations( [ 863.411496] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 863.411496] nova-conductor[52626]: selections = self._schedule( [ 863.411496] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 863.411496] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 863.411496] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 863.411496] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 863.411496] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 863.411496] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 863.411496] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 863.411496] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 863.411496] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 863.411496] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 863.411496] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 863.411496] nova-conductor[52626]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 863.411496] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 863.411496] nova-conductor[52626]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 863.411496] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 863.411496] nova-conductor[52626]: ERROR nova.conductor.manager result = self.transport._send( [ 863.411496] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 863.411496] nova-conductor[52626]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 863.411496] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 863.411496] nova-conductor[52626]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 863.411496] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 863.411496] nova-conductor[52626]: ERROR nova.conductor.manager raise result [ 863.411496] nova-conductor[52626]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 863.411496] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 863.411496] nova-conductor[52626]: ERROR nova.conductor.manager [ 863.411496] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 863.411496] nova-conductor[52626]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 863.411496] nova-conductor[52626]: ERROR nova.conductor.manager [ 863.411496] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 863.411496] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._select_destinations( [ 863.411496] nova-conductor[52626]: ERROR nova.conductor.manager [ 863.411496] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 863.411496] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._schedule( [ 863.411496] nova-conductor[52626]: ERROR nova.conductor.manager [ 863.411496] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 863.411496] nova-conductor[52626]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 863.411496] nova-conductor[52626]: ERROR nova.conductor.manager [ 863.411496] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 863.411496] nova-conductor[52626]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 863.411496] nova-conductor[52626]: ERROR nova.conductor.manager [ 863.411496] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 863.411496] nova-conductor[52626]: ERROR nova.conductor.manager [ 863.411496] nova-conductor[52626]: ERROR nova.conductor.manager [ 863.419430] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-5a1ad026-4d62-45bf-86d4-88562e576460 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 863.419772] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-5a1ad026-4d62-45bf-86d4-88562e576460 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 863.419993] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-5a1ad026-4d62-45bf-86d4-88562e576460 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 863.468763] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-5a1ad026-4d62-45bf-86d4-88562e576460 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] [instance: 185cd01d-b2d4-4b5d-ad6a-9ab943ce76b1] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 863.469588] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-5a1ad026-4d62-45bf-86d4-88562e576460 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 863.469856] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-5a1ad026-4d62-45bf-86d4-88562e576460 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 863.470078] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-5a1ad026-4d62-45bf-86d4-88562e576460 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Lock "00000000-0000-0000-0000-000000000000" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 863.472901] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-5a1ad026-4d62-45bf-86d4-88562e576460 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 863.472901] nova-conductor[52626]: Traceback (most recent call last): [ 863.472901] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 863.472901] nova-conductor[52626]: return func(*args, **kwargs) [ 863.472901] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 863.472901] nova-conductor[52626]: selections = self._select_destinations( [ 863.472901] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 863.472901] nova-conductor[52626]: selections = self._schedule( [ 863.472901] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 863.472901] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 863.472901] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 863.472901] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 863.472901] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 863.472901] nova-conductor[52626]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 863.473615] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-5a1ad026-4d62-45bf-86d4-88562e576460 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] [instance: 185cd01d-b2d4-4b5d-ad6a-9ab943ce76b1] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 865.623973] nova-conductor[52625]: ERROR nova.conductor.manager [None req-0c5a3b63-2f7b-47fb-ad49-e587e966ac04 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 865.623973] nova-conductor[52625]: Traceback (most recent call last): [ 865.623973] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 865.623973] nova-conductor[52625]: return func(*args, **kwargs) [ 865.623973] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 865.623973] nova-conductor[52625]: selections = self._select_destinations( [ 865.623973] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 865.623973] nova-conductor[52625]: selections = self._schedule( [ 865.623973] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 865.623973] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 865.623973] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 865.623973] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 865.623973] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 865.623973] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 865.623973] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 865.623973] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 865.623973] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 865.623973] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 865.623973] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 865.623973] nova-conductor[52625]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 865.623973] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 865.623973] nova-conductor[52625]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 865.623973] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 865.623973] nova-conductor[52625]: ERROR nova.conductor.manager result = self.transport._send( [ 865.623973] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 865.623973] nova-conductor[52625]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 865.623973] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 865.623973] nova-conductor[52625]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 865.623973] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 865.623973] nova-conductor[52625]: ERROR nova.conductor.manager raise result [ 865.623973] nova-conductor[52625]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 865.623973] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 865.623973] nova-conductor[52625]: ERROR nova.conductor.manager [ 865.623973] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 865.623973] nova-conductor[52625]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 865.623973] nova-conductor[52625]: ERROR nova.conductor.manager [ 865.623973] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 865.623973] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._select_destinations( [ 865.623973] nova-conductor[52625]: ERROR nova.conductor.manager [ 865.623973] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 865.623973] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._schedule( [ 865.623973] nova-conductor[52625]: ERROR nova.conductor.manager [ 865.623973] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 865.623973] nova-conductor[52625]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 865.623973] nova-conductor[52625]: ERROR nova.conductor.manager [ 865.623973] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 865.623973] nova-conductor[52625]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 865.623973] nova-conductor[52625]: ERROR nova.conductor.manager [ 865.623973] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 865.623973] nova-conductor[52625]: ERROR nova.conductor.manager [ 865.623973] nova-conductor[52625]: ERROR nova.conductor.manager [ 865.631603] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-0c5a3b63-2f7b-47fb-ad49-e587e966ac04 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 865.631603] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-0c5a3b63-2f7b-47fb-ad49-e587e966ac04 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 865.631603] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-0c5a3b63-2f7b-47fb-ad49-e587e966ac04 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 865.682862] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-0c5a3b63-2f7b-47fb-ad49-e587e966ac04 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] [instance: ae17d11e-5e26-465a-b4ad-7605df308f99] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 865.683635] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-0c5a3b63-2f7b-47fb-ad49-e587e966ac04 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 865.683857] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-0c5a3b63-2f7b-47fb-ad49-e587e966ac04 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 865.684046] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-0c5a3b63-2f7b-47fb-ad49-e587e966ac04 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Lock "00000000-0000-0000-0000-000000000000" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 865.690144] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-0c5a3b63-2f7b-47fb-ad49-e587e966ac04 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 865.690144] nova-conductor[52625]: Traceback (most recent call last): [ 865.690144] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 865.690144] nova-conductor[52625]: return func(*args, **kwargs) [ 865.690144] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 865.690144] nova-conductor[52625]: selections = self._select_destinations( [ 865.690144] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 865.690144] nova-conductor[52625]: selections = self._schedule( [ 865.690144] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 865.690144] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 865.690144] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 865.690144] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 865.690144] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 865.690144] nova-conductor[52625]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 865.690144] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-0c5a3b63-2f7b-47fb-ad49-e587e966ac04 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] [instance: ae17d11e-5e26-465a-b4ad-7605df308f99] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 865.718453] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-0c5a3b63-2f7b-47fb-ad49-e587e966ac04 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 865.718769] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-0c5a3b63-2f7b-47fb-ad49-e587e966ac04 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 865.718858] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-0c5a3b63-2f7b-47fb-ad49-e587e966ac04 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 867.206967] nova-conductor[52625]: ERROR nova.conductor.manager [None req-7794539f-e268-4bcc-b7c9-2cb44df4c9b7 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 867.206967] nova-conductor[52625]: Traceback (most recent call last): [ 867.206967] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 867.206967] nova-conductor[52625]: return func(*args, **kwargs) [ 867.206967] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 867.206967] nova-conductor[52625]: selections = self._select_destinations( [ 867.206967] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 867.206967] nova-conductor[52625]: selections = self._schedule( [ 867.206967] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 867.206967] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 867.206967] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 867.206967] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 867.206967] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 867.206967] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 867.206967] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 867.206967] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 867.206967] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 867.206967] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 867.206967] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 867.206967] nova-conductor[52625]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 867.206967] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 867.206967] nova-conductor[52625]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 867.206967] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 867.206967] nova-conductor[52625]: ERROR nova.conductor.manager result = self.transport._send( [ 867.206967] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 867.206967] nova-conductor[52625]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 867.206967] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 867.206967] nova-conductor[52625]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 867.206967] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 867.206967] nova-conductor[52625]: ERROR nova.conductor.manager raise result [ 867.206967] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 867.206967] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 867.206967] nova-conductor[52625]: ERROR nova.conductor.manager [ 867.206967] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 867.206967] nova-conductor[52625]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 867.206967] nova-conductor[52625]: ERROR nova.conductor.manager [ 867.206967] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 867.206967] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._select_destinations( [ 867.206967] nova-conductor[52625]: ERROR nova.conductor.manager [ 867.206967] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 867.206967] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._schedule( [ 867.206967] nova-conductor[52625]: ERROR nova.conductor.manager [ 867.206967] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 867.206967] nova-conductor[52625]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 867.206967] nova-conductor[52625]: ERROR nova.conductor.manager [ 867.206967] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 867.206967] nova-conductor[52625]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 867.206967] nova-conductor[52625]: ERROR nova.conductor.manager [ 867.206967] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 867.206967] nova-conductor[52625]: ERROR nova.conductor.manager [ 867.206967] nova-conductor[52625]: ERROR nova.conductor.manager [ 867.214577] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-7794539f-e268-4bcc-b7c9-2cb44df4c9b7 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 867.214796] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-7794539f-e268-4bcc-b7c9-2cb44df4c9b7 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 867.214978] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-7794539f-e268-4bcc-b7c9-2cb44df4c9b7 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 867.279167] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-7794539f-e268-4bcc-b7c9-2cb44df4c9b7 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] [instance: 3ff637b5-bfcf-45e3-ae38-260c6a7cd6c2] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 867.279922] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-7794539f-e268-4bcc-b7c9-2cb44df4c9b7 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 867.280159] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-7794539f-e268-4bcc-b7c9-2cb44df4c9b7 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 867.280333] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-7794539f-e268-4bcc-b7c9-2cb44df4c9b7 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Lock "00000000-0000-0000-0000-000000000000" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 867.283402] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-7794539f-e268-4bcc-b7c9-2cb44df4c9b7 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 867.283402] nova-conductor[52625]: Traceback (most recent call last): [ 867.283402] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 867.283402] nova-conductor[52625]: return func(*args, **kwargs) [ 867.283402] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 867.283402] nova-conductor[52625]: selections = self._select_destinations( [ 867.283402] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 867.283402] nova-conductor[52625]: selections = self._schedule( [ 867.283402] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 867.283402] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 867.283402] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 867.283402] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 867.283402] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 867.283402] nova-conductor[52625]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 867.284066] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-7794539f-e268-4bcc-b7c9-2cb44df4c9b7 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] [instance: 3ff637b5-bfcf-45e3-ae38-260c6a7cd6c2] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 868.793823] nova-conductor[52626]: ERROR nova.conductor.manager [None req-6c50723e-c89b-491d-bdf2-640d311114fc tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 868.793823] nova-conductor[52626]: Traceback (most recent call last): [ 868.793823] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 868.793823] nova-conductor[52626]: return func(*args, **kwargs) [ 868.793823] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 868.793823] nova-conductor[52626]: selections = self._select_destinations( [ 868.793823] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 868.793823] nova-conductor[52626]: selections = self._schedule( [ 868.793823] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 868.793823] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 868.793823] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 868.793823] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 868.793823] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 868.793823] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 868.793823] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 868.793823] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 868.793823] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 868.793823] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 868.793823] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 868.793823] nova-conductor[52626]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 868.793823] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 868.793823] nova-conductor[52626]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 868.793823] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 868.793823] nova-conductor[52626]: ERROR nova.conductor.manager result = self.transport._send( [ 868.793823] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 868.793823] nova-conductor[52626]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 868.793823] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 868.793823] nova-conductor[52626]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 868.793823] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 868.793823] nova-conductor[52626]: ERROR nova.conductor.manager raise result [ 868.793823] nova-conductor[52626]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 868.793823] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 868.793823] nova-conductor[52626]: ERROR nova.conductor.manager [ 868.793823] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 868.793823] nova-conductor[52626]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 868.793823] nova-conductor[52626]: ERROR nova.conductor.manager [ 868.793823] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 868.793823] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._select_destinations( [ 868.793823] nova-conductor[52626]: ERROR nova.conductor.manager [ 868.793823] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 868.793823] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._schedule( [ 868.793823] nova-conductor[52626]: ERROR nova.conductor.manager [ 868.793823] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 868.793823] nova-conductor[52626]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 868.793823] nova-conductor[52626]: ERROR nova.conductor.manager [ 868.793823] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 868.793823] nova-conductor[52626]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 868.793823] nova-conductor[52626]: ERROR nova.conductor.manager [ 868.793823] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 868.793823] nova-conductor[52626]: ERROR nova.conductor.manager [ 868.793823] nova-conductor[52626]: ERROR nova.conductor.manager [ 868.800234] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-6c50723e-c89b-491d-bdf2-640d311114fc tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 868.800452] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-6c50723e-c89b-491d-bdf2-640d311114fc tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 868.800622] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-6c50723e-c89b-491d-bdf2-640d311114fc tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 868.834954] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-6c50723e-c89b-491d-bdf2-640d311114fc tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] [instance: a810a7a9-31f2-47a8-9c16-428f92c36303] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 868.835749] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-6c50723e-c89b-491d-bdf2-640d311114fc tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 868.835957] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-6c50723e-c89b-491d-bdf2-640d311114fc tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 868.836148] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-6c50723e-c89b-491d-bdf2-640d311114fc tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Lock "00000000-0000-0000-0000-000000000000" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 868.840703] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-6c50723e-c89b-491d-bdf2-640d311114fc tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 868.840703] nova-conductor[52626]: Traceback (most recent call last): [ 868.840703] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 868.840703] nova-conductor[52626]: return func(*args, **kwargs) [ 868.840703] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 868.840703] nova-conductor[52626]: selections = self._select_destinations( [ 868.840703] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 868.840703] nova-conductor[52626]: selections = self._schedule( [ 868.840703] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 868.840703] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 868.840703] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 868.840703] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 868.840703] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 868.840703] nova-conductor[52626]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 868.841304] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-6c50723e-c89b-491d-bdf2-640d311114fc tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] [instance: a810a7a9-31f2-47a8-9c16-428f92c36303] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 870.397965] nova-conductor[52625]: ERROR nova.conductor.manager [None req-4691dc3d-13d5-4021-919b-327ad07d1fdc tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 870.397965] nova-conductor[52625]: Traceback (most recent call last): [ 870.397965] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 870.397965] nova-conductor[52625]: return func(*args, **kwargs) [ 870.397965] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 870.397965] nova-conductor[52625]: selections = self._select_destinations( [ 870.397965] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 870.397965] nova-conductor[52625]: selections = self._schedule( [ 870.397965] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 870.397965] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 870.397965] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 870.397965] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 870.397965] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 870.397965] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 870.397965] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 870.397965] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 870.397965] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 870.397965] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 870.397965] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 870.397965] nova-conductor[52625]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 870.397965] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 870.397965] nova-conductor[52625]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 870.397965] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 870.397965] nova-conductor[52625]: ERROR nova.conductor.manager result = self.transport._send( [ 870.397965] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 870.397965] nova-conductor[52625]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 870.397965] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 870.397965] nova-conductor[52625]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 870.397965] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 870.397965] nova-conductor[52625]: ERROR nova.conductor.manager raise result [ 870.397965] nova-conductor[52625]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 870.397965] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 870.397965] nova-conductor[52625]: ERROR nova.conductor.manager [ 870.397965] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 870.397965] nova-conductor[52625]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 870.397965] nova-conductor[52625]: ERROR nova.conductor.manager [ 870.397965] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 870.397965] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._select_destinations( [ 870.397965] nova-conductor[52625]: ERROR nova.conductor.manager [ 870.397965] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 870.397965] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._schedule( [ 870.397965] nova-conductor[52625]: ERROR nova.conductor.manager [ 870.397965] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 870.397965] nova-conductor[52625]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 870.397965] nova-conductor[52625]: ERROR nova.conductor.manager [ 870.397965] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 870.397965] nova-conductor[52625]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 870.397965] nova-conductor[52625]: ERROR nova.conductor.manager [ 870.397965] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 870.397965] nova-conductor[52625]: ERROR nova.conductor.manager [ 870.397965] nova-conductor[52625]: ERROR nova.conductor.manager [ 870.404854] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-4691dc3d-13d5-4021-919b-327ad07d1fdc tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 870.405157] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-4691dc3d-13d5-4021-919b-327ad07d1fdc tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 870.405345] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-4691dc3d-13d5-4021-919b-327ad07d1fdc tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 870.439107] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-4691dc3d-13d5-4021-919b-327ad07d1fdc tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] [instance: 677590cd-48de-4fc3-be29-358b28ca61d8] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 870.439850] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-4691dc3d-13d5-4021-919b-327ad07d1fdc tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 870.440087] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-4691dc3d-13d5-4021-919b-327ad07d1fdc tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 870.440310] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-4691dc3d-13d5-4021-919b-327ad07d1fdc tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Lock "00000000-0000-0000-0000-000000000000" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 870.445261] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-4691dc3d-13d5-4021-919b-327ad07d1fdc tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 870.445261] nova-conductor[52625]: Traceback (most recent call last): [ 870.445261] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 870.445261] nova-conductor[52625]: return func(*args, **kwargs) [ 870.445261] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 870.445261] nova-conductor[52625]: selections = self._select_destinations( [ 870.445261] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 870.445261] nova-conductor[52625]: selections = self._schedule( [ 870.445261] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 870.445261] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 870.445261] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 870.445261] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 870.445261] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 870.445261] nova-conductor[52625]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 870.445817] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-4691dc3d-13d5-4021-919b-327ad07d1fdc tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] [instance: 677590cd-48de-4fc3-be29-358b28ca61d8] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 872.046056] nova-conductor[52626]: ERROR nova.conductor.manager [None req-01ce036b-8d9d-4f53-9f1c-5631f37e72f5 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 872.046056] nova-conductor[52626]: Traceback (most recent call last): [ 872.046056] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 872.046056] nova-conductor[52626]: return func(*args, **kwargs) [ 872.046056] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 872.046056] nova-conductor[52626]: selections = self._select_destinations( [ 872.046056] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 872.046056] nova-conductor[52626]: selections = self._schedule( [ 872.046056] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 872.046056] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 872.046056] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 872.046056] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 872.046056] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 872.046056] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 872.046056] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 872.046056] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 872.046056] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 872.046056] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 872.046056] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 872.046056] nova-conductor[52626]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 872.046056] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 872.046056] nova-conductor[52626]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 872.046056] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 872.046056] nova-conductor[52626]: ERROR nova.conductor.manager result = self.transport._send( [ 872.046056] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 872.046056] nova-conductor[52626]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 872.046056] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 872.046056] nova-conductor[52626]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 872.046056] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 872.046056] nova-conductor[52626]: ERROR nova.conductor.manager raise result [ 872.046056] nova-conductor[52626]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 872.046056] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 872.046056] nova-conductor[52626]: ERROR nova.conductor.manager [ 872.046056] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 872.046056] nova-conductor[52626]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 872.046056] nova-conductor[52626]: ERROR nova.conductor.manager [ 872.046056] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 872.046056] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._select_destinations( [ 872.046056] nova-conductor[52626]: ERROR nova.conductor.manager [ 872.046056] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 872.046056] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._schedule( [ 872.046056] nova-conductor[52626]: ERROR nova.conductor.manager [ 872.046056] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 872.046056] nova-conductor[52626]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 872.046056] nova-conductor[52626]: ERROR nova.conductor.manager [ 872.046056] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 872.046056] nova-conductor[52626]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 872.046056] nova-conductor[52626]: ERROR nova.conductor.manager [ 872.046056] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 872.046056] nova-conductor[52626]: ERROR nova.conductor.manager [ 872.046056] nova-conductor[52626]: ERROR nova.conductor.manager [ 872.052711] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-01ce036b-8d9d-4f53-9f1c-5631f37e72f5 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 872.052941] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-01ce036b-8d9d-4f53-9f1c-5631f37e72f5 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 872.053128] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-01ce036b-8d9d-4f53-9f1c-5631f37e72f5 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 872.096439] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-01ce036b-8d9d-4f53-9f1c-5631f37e72f5 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] [instance: 339f930d-9886-458a-ba0c-427344905b62] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 872.097564] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-01ce036b-8d9d-4f53-9f1c-5631f37e72f5 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 872.097960] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-01ce036b-8d9d-4f53-9f1c-5631f37e72f5 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 872.098157] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-01ce036b-8d9d-4f53-9f1c-5631f37e72f5 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Lock "00000000-0000-0000-0000-000000000000" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 872.101968] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-01ce036b-8d9d-4f53-9f1c-5631f37e72f5 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 872.101968] nova-conductor[52626]: Traceback (most recent call last): [ 872.101968] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 872.101968] nova-conductor[52626]: return func(*args, **kwargs) [ 872.101968] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 872.101968] nova-conductor[52626]: selections = self._select_destinations( [ 872.101968] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 872.101968] nova-conductor[52626]: selections = self._schedule( [ 872.101968] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 872.101968] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 872.101968] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 872.101968] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 872.101968] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 872.101968] nova-conductor[52626]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 872.101968] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-01ce036b-8d9d-4f53-9f1c-5631f37e72f5 tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] [instance: 339f930d-9886-458a-ba0c-427344905b62] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 873.124382] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Took 0.13 seconds to select destinations for 1 instance(s). 
{{(pid=52626) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 873.138792] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 873.139048] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 873.139303] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 873.198379] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 873.198379] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 873.198379] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 873.198379] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 873.198379] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 
0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 873.198379] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 873.205324] nova-conductor[52626]: DEBUG nova.quota [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Getting quotas for project 027d599d310b4abf9ce371b09bb3253b. Resources: {'cores', 'instances', 'ram'} {{(pid=52626) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 873.208536] nova-conductor[52626]: DEBUG nova.quota [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Getting quotas for user f6f1fc08157841f3bdd66c7e1bc5afa8 and project 027d599d310b4abf9ce371b09bb3253b. Resources: {'cores', 'instances', 'ram'} {{(pid=52626) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 873.215478] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52626) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 873.215478] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 873.215478] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 873.215478] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 873.218663] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 873.218873] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 873.218917] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 873.219261] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 873.233632] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 873.233632] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 873.233632] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 878.332176] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Took 0.13 seconds to select destinations for 
1 instance(s). {{(pid=52625) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 878.343748] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 878.343977] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 878.344172] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 878.371922] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 878.372186] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 878.372363] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 878.372721] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 878.372907] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 878.373086] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 878.380998] nova-conductor[52625]: DEBUG nova.quota [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Getting quotas for project 0e5e8feb5e194fddb47dd8364495bb2b. Resources: {'cores', 'instances', 'ram'} {{(pid=52625) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 878.383318] nova-conductor[52625]: DEBUG nova.quota [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Getting quotas for user 3ad0ae5480f1418283d649af7e0e3810 and project 0e5e8feb5e194fddb47dd8364495bb2b. Resources: {'cores', 'instances', 'ram'} {{(pid=52625) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 878.389027] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52625) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 878.389328] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 878.389547] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 878.389719] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 878.392490] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 878.393136] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 878.393339] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 878.393514] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 878.405461] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 878.405673] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 878.405846] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 898.353103] nova-conductor[52625]: ERROR nova.scheduler.utils [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 
7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658 was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 898.353842] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Rescheduling: True {{(pid=52625) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 898.354112] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658. 
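The records above show the conductor's reschedule path for instance 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658: the VMware CopyVirtualDisk task failed with InvalidArgument on fileType, the compute node re-raised it as RescheduledException, and with no alternate hosts left the retry logic raised MaxRetriesExceeded. A minimal sketch of that retry bookkeeping, using hypothetical names (build_on_host, RescheduledError, MAX_ATTEMPTS) rather than Nova's actual classes or configuration:

# Simplified illustration of the reschedule/retry bookkeeping reflected in the
# records above. build_on_host(), RescheduledError and MAX_ATTEMPTS are
# hypothetical stand-ins, not Nova's actual code.

class RescheduledError(Exception):
    """A host failed the build and asked for the instance to be rescheduled."""

class MaxRetriesExceededError(Exception):
    """Every candidate host (or retry attempt) has been used up."""

MAX_ATTEMPTS = 3

def build_with_retries(instance_uuid, candidate_hosts, build_on_host):
    attempts = 0
    for host in candidate_hosts:
        if attempts >= MAX_ATTEMPTS:
            break
        attempts += 1
        try:
            return build_on_host(host, instance_uuid)
        except RescheduledError as exc:
            # Mirrors "Error from last host: ..." followed by "Rescheduling: True".
            print(f"instance {instance_uuid}: error from host {host}: {exc}; rescheduling")
    # Mirrors the MaxRetriesExceeded warning and the move to ERROR state.
    raise MaxRetriesExceededError(
        "Exceeded maximum number of retries. Exhausted all hosts available "
        f"for retrying build failures for instance {instance_uuid}.")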
[ 898.354750] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658. [ 898.377312] nova-conductor[52625]: DEBUG nova.network.neutron [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] deallocate_for_instance() {{(pid=52625) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 898.397574] nova-conductor[52625]: DEBUG nova.network.neutron [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Instance cache missing network info. {{(pid=52625) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 898.401606] nova-conductor[52625]: DEBUG nova.network.neutron [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Updating instance_info_cache with network_info: [] {{(pid=52625) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 898.403806] nova-conductor[52626]: DEBUG nova.db.main.api [None req-6171457e-e860-4a70-9e29-308983af6168 tempest-SecurityGroupsTestJSON-754098049 tempest-SecurityGroupsTestJSON-754098049-project-member] Created instance_extra for 71244679-78d6-4d49-b4b5-ef96fd313ae8 {{(pid=52626) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 898.456517] nova-conductor[52626]: DEBUG nova.db.main.api [None req-554d3b0c-04fc-4d9e-922c-f3b7f39c74e5 tempest-InstanceActionsTestJSON-1474696113 tempest-InstanceActionsTestJSON-1474696113-project-member] Created instance_extra for fb825c5f-bd66-40aa-8027-cb425f3b9b96 {{(pid=52626) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 898.522012] nova-conductor[52626]: DEBUG nova.db.main.api [None req-27f74e42-677d-4112-bb9f-55e756a3a7fd tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Created instance_extra for c76409ad-b0aa-4da6-ac83-58f617ec2588 {{(pid=52626) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 898.581019] nova-conductor[52626]: DEBUG nova.db.main.api [None req-94915086-54d2-4bcf-bcb7-396933e24f9e tempest-ServerGroupTestJSON-801235064 tempest-ServerGroupTestJSON-801235064-project-member] Created instance_extra for 63823a4b-97e0-48f9-9fb9-7c4fe3858343 {{(pid=52626) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 898.627690] nova-conductor[52625]: DEBUG nova.db.main.api [None req-3d92add3-634e-4b1e-8d5a-9590d765273e tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Created instance_extra for 070d142d-6a47-49bc-a061-3101da79447a {{(pid=52625) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 900.135730] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 
tempest-MigrationsAdminTest-1576523191-project-member] Took 0.10 seconds to select destinations for 1 instance(s). {{(pid=52625) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 900.147151] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 900.147395] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 900.147573] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 900.174265] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 900.174494] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 900.174698] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 900.176050] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 900.176050] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Lock 
"e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 900.176050] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 900.183479] nova-conductor[52625]: DEBUG nova.quota [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Getting quotas for project ed721a0a42ee43fba6f37868594bffec. Resources: {'cores', 'instances', 'ram'} {{(pid=52625) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 900.185659] nova-conductor[52625]: DEBUG nova.quota [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Getting quotas for user 4930bb1ad0cc4376a388847b3238dded and project ed721a0a42ee43fba6f37868594bffec. Resources: {'cores', 'instances', 'ram'} {{(pid=52625) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 900.191028] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52625) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 900.191451] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 900.191681] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 900.191856] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 900.194598] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 900.195225] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 900.195422] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 900.195593] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 900.207261] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 900.207491] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 900.207635] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 945.969536] nova-conductor[52625]: ERROR nova.scheduler.utils [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 
8bc7299c-35d4-4e9f-a243-2834fbadd987] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance 8bc7299c-35d4-4e9f-a243-2834fbadd987 was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 945.970269] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Rescheduling: True {{(pid=52625) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 945.970533] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 8bc7299c-35d4-4e9f-a243-2834fbadd987.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 8bc7299c-35d4-4e9f-a243-2834fbadd987. 
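The same VimFaultException/MaxRetriesExceeded pattern repeats above for instance 8bc7299c-35d4-4e9f-a243-2834fbadd987. When triaging a run like this, a small parser over the conductor log can tally which instances hit which failure; a sketch under the assumption that the records sit in a plain text file (the path nova-conductor.log is illustrative only):

# Sketch: tally which instances hit RescheduledException, MaxRetriesExceeded or
# NoValidHost in a conductor log shaped like the records above. The input path
# "nova-conductor.log" is an assumption for illustration.
import re
from collections import defaultdict

INSTANCE_RE = re.compile(r"\[instance: ([0-9a-f-]{36})\]")
ERROR_NAMES = ("RescheduledException", "MaxRetriesExceeded", "NoValidHost")

def tally_errors(path="nova-conductor.log"):
    counts = defaultdict(lambda: defaultdict(int))
    with open(path) as fh:
        for line in fh:
            match = INSTANCE_RE.search(line)
            if not match:
                continue
            for name in ERROR_NAMES:
                if name in line:
                    counts[match.group(1)][name] += 1
    return counts

if __name__ == "__main__":
    for uuid, errors in sorted(tally_errors().items()):
        print(uuid, dict(errors))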
[ 945.971095] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 8bc7299c-35d4-4e9f-a243-2834fbadd987. [ 946.000444] nova-conductor[52625]: DEBUG nova.network.neutron [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] deallocate_for_instance() {{(pid=52625) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 946.016560] nova-conductor[52625]: DEBUG nova.network.neutron [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Instance cache missing network info. {{(pid=52625) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 946.020031] nova-conductor[52625]: DEBUG nova.network.neutron [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Updating instance_info_cache with network_info: [] {{(pid=52625) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 948.531827] nova-conductor[52625]: ERROR nova.conductor.manager [None req-07c5ae0c-2eb7-4271-a59f-50918564f855 tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 948.531827] nova-conductor[52625]: Traceback (most recent call last): [ 948.531827] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 948.531827] nova-conductor[52625]: return func(*args, **kwargs) [ 948.531827] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 948.531827] nova-conductor[52625]: selections = self._select_destinations( [ 948.531827] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 948.531827] nova-conductor[52625]: selections = self._schedule( [ 948.531827] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 948.531827] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 948.531827] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 948.531827] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 948.531827] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 948.531827] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 948.531827] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 948.531827] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 948.531827] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 948.531827] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 948.531827] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 948.531827] nova-conductor[52625]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 948.531827] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 948.531827] nova-conductor[52625]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 948.531827] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 948.531827] nova-conductor[52625]: ERROR nova.conductor.manager result = self.transport._send( [ 948.531827] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 948.531827] nova-conductor[52625]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 948.531827] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 948.531827] nova-conductor[52625]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 948.531827] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 948.531827] nova-conductor[52625]: ERROR nova.conductor.manager raise result [ 948.531827] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 948.531827] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 948.531827] nova-conductor[52625]: ERROR nova.conductor.manager [ 948.531827] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 948.531827] nova-conductor[52625]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 948.531827] nova-conductor[52625]: ERROR nova.conductor.manager [ 948.531827] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 948.531827] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._select_destinations( [ 948.531827] nova-conductor[52625]: ERROR nova.conductor.manager [ 948.531827] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 948.531827] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._schedule( [ 948.531827] nova-conductor[52625]: ERROR nova.conductor.manager [ 948.531827] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 948.531827] nova-conductor[52625]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 948.531827] nova-conductor[52625]: ERROR nova.conductor.manager [ 948.531827] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 948.531827] nova-conductor[52625]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 948.531827] nova-conductor[52625]: ERROR nova.conductor.manager [ 948.531827] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 948.531827] nova-conductor[52625]: ERROR nova.conductor.manager [ 948.531827] nova-conductor[52625]: ERROR nova.conductor.manager [ 948.544758] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-07c5ae0c-2eb7-4271-a59f-50918564f855 tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 948.545013] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-07c5ae0c-2eb7-4271-a59f-50918564f855 tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 948.545224] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-07c5ae0c-2eb7-4271-a59f-50918564f855 tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 948.601996] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-07c5ae0c-2eb7-4271-a59f-50918564f855 tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: b68a338b-cdfb-46a5-9a4d-212599ededcd] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 948.603717] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-07c5ae0c-2eb7-4271-a59f-50918564f855 tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 948.603717] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-07c5ae0c-2eb7-4271-a59f-50918564f855 tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 948.603717] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-07c5ae0c-2eb7-4271-a59f-50918564f855 tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 948.606290] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-07c5ae0c-2eb7-4271-a59f-50918564f855 tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 948.606290] nova-conductor[52625]: Traceback (most recent call last): [ 948.606290] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 948.606290] nova-conductor[52625]: return func(*args, **kwargs) [ 948.606290] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 948.606290] nova-conductor[52625]: selections = self._select_destinations( [ 948.606290] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 948.606290] nova-conductor[52625]: selections = self._schedule( [ 948.606290] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 948.606290] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 948.606290] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 948.606290] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 948.606290] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 948.606290] nova-conductor[52625]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 948.606809] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-07c5ae0c-2eb7-4271-a59f-50918564f855 tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: b68a338b-cdfb-46a5-9a4d-212599ededcd] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 951.535595] nova-conductor[52625]: ERROR nova.conductor.manager [None req-fe1a4ed5-9c82-41a1-aa97-bbfb66fd79d9 tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 951.535595] nova-conductor[52625]: Traceback (most recent call last): [ 951.535595] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 951.535595] nova-conductor[52625]: return func(*args, **kwargs) [ 951.535595] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 951.535595] nova-conductor[52625]: selections = self._select_destinations( [ 951.535595] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 951.535595] nova-conductor[52625]: selections = self._schedule( [ 951.535595] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 951.535595] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 951.535595] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 951.535595] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 951.535595] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 951.535595] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 951.535595] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 951.535595] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 951.535595] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 951.535595] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 951.535595] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 951.535595] nova-conductor[52625]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 951.535595] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 951.535595] nova-conductor[52625]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 951.535595] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 951.535595] nova-conductor[52625]: ERROR nova.conductor.manager result = self.transport._send( [ 951.535595] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 951.535595] nova-conductor[52625]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 951.535595] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 951.535595] nova-conductor[52625]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 951.535595] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 951.535595] nova-conductor[52625]: ERROR nova.conductor.manager raise result [ 951.535595] nova-conductor[52625]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 951.535595] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 951.535595] nova-conductor[52625]: ERROR nova.conductor.manager [ 951.535595] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 951.535595] nova-conductor[52625]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 951.535595] nova-conductor[52625]: ERROR nova.conductor.manager [ 951.535595] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 951.535595] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._select_destinations( [ 951.535595] nova-conductor[52625]: ERROR nova.conductor.manager [ 951.535595] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 951.535595] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._schedule( [ 951.535595] nova-conductor[52625]: ERROR nova.conductor.manager [ 951.535595] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 951.535595] nova-conductor[52625]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 951.535595] nova-conductor[52625]: ERROR nova.conductor.manager [ 951.535595] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 951.535595] nova-conductor[52625]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 951.535595] nova-conductor[52625]: ERROR nova.conductor.manager [ 951.535595] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 951.535595] nova-conductor[52625]: ERROR nova.conductor.manager [ 951.535595] nova-conductor[52625]: ERROR nova.conductor.manager [ 951.542295] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-fe1a4ed5-9c82-41a1-aa97-bbfb66fd79d9 tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 951.542519] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-fe1a4ed5-9c82-41a1-aa97-bbfb66fd79d9 tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 951.542692] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-fe1a4ed5-9c82-41a1-aa97-bbfb66fd79d9 tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 951.581021] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-fe1a4ed5-9c82-41a1-aa97-bbfb66fd79d9 tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 46256a59-520e-4485-97ba-8fe1a5532baf] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 951.581787] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-fe1a4ed5-9c82-41a1-aa97-bbfb66fd79d9 tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 951.582079] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-fe1a4ed5-9c82-41a1-aa97-bbfb66fd79d9 tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 951.582307] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-fe1a4ed5-9c82-41a1-aa97-bbfb66fd79d9 tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 951.585195] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-fe1a4ed5-9c82-41a1-aa97-bbfb66fd79d9 tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 951.585195] nova-conductor[52625]: Traceback (most recent call last): [ 951.585195] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 951.585195] nova-conductor[52625]: return func(*args, **kwargs) [ 951.585195] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 951.585195] nova-conductor[52625]: selections = self._select_destinations( [ 951.585195] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 951.585195] nova-conductor[52625]: selections = self._schedule( [ 951.585195] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 951.585195] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 951.585195] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 951.585195] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 951.585195] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 951.585195] nova-conductor[52625]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 951.585711] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-fe1a4ed5-9c82-41a1-aa97-bbfb66fd79d9 tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 46256a59-520e-4485-97ba-8fe1a5532baf] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 958.101249] nova-conductor[52626]: ERROR nova.conductor.manager [None req-5b691621-fdba-4b2e-ba8f-c3c8a4de96e3 tempest-InstanceActionsV221TestJSON-897685954 tempest-InstanceActionsV221TestJSON-897685954-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 958.101249] nova-conductor[52626]: Traceback (most recent call last): [ 958.101249] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 958.101249] nova-conductor[52626]: return func(*args, **kwargs) [ 958.101249] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 958.101249] nova-conductor[52626]: selections = self._select_destinations( [ 958.101249] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 958.101249] nova-conductor[52626]: selections = self._schedule( [ 958.101249] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 958.101249] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 958.101249] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 958.101249] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 958.101249] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 958.101249] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 958.101249] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 958.101249] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 958.101249] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 958.101249] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 958.101249] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 958.101249] nova-conductor[52626]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 958.101249] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 958.101249] nova-conductor[52626]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 958.101249] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 958.101249] nova-conductor[52626]: ERROR nova.conductor.manager result = self.transport._send( [ 958.101249] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 958.101249] nova-conductor[52626]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 958.101249] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 958.101249] nova-conductor[52626]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 958.101249] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 958.101249] nova-conductor[52626]: ERROR nova.conductor.manager raise result [ 958.101249] nova-conductor[52626]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 958.101249] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 958.101249] nova-conductor[52626]: ERROR nova.conductor.manager [ 958.101249] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 958.101249] nova-conductor[52626]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 958.101249] nova-conductor[52626]: ERROR nova.conductor.manager [ 958.101249] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 958.101249] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._select_destinations( [ 958.101249] nova-conductor[52626]: ERROR nova.conductor.manager [ 958.101249] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 958.101249] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._schedule( [ 958.101249] nova-conductor[52626]: ERROR nova.conductor.manager [ 958.101249] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 958.101249] nova-conductor[52626]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 958.101249] nova-conductor[52626]: ERROR nova.conductor.manager [ 958.101249] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 958.101249] nova-conductor[52626]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 958.101249] nova-conductor[52626]: ERROR nova.conductor.manager [ 958.101249] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 958.101249] nova-conductor[52626]: ERROR nova.conductor.manager [ 958.101249] nova-conductor[52626]: ERROR nova.conductor.manager [ 958.111232] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-5b691621-fdba-4b2e-ba8f-c3c8a4de96e3 tempest-InstanceActionsV221TestJSON-897685954 tempest-InstanceActionsV221TestJSON-897685954-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 958.112259] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-5b691621-fdba-4b2e-ba8f-c3c8a4de96e3 tempest-InstanceActionsV221TestJSON-897685954 tempest-InstanceActionsV221TestJSON-897685954-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 958.112915] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-5b691621-fdba-4b2e-ba8f-c3c8a4de96e3 tempest-InstanceActionsV221TestJSON-897685954 tempest-InstanceActionsV221TestJSON-897685954-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 958.166987] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-5b691621-fdba-4b2e-ba8f-c3c8a4de96e3 tempest-InstanceActionsV221TestJSON-897685954 tempest-InstanceActionsV221TestJSON-897685954-project-member] [instance: baa28177-bbec-463f-87ff-2673e223b0d1] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 958.167816] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-5b691621-fdba-4b2e-ba8f-c3c8a4de96e3 tempest-InstanceActionsV221TestJSON-897685954 tempest-InstanceActionsV221TestJSON-897685954-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 958.168147] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-5b691621-fdba-4b2e-ba8f-c3c8a4de96e3 tempest-InstanceActionsV221TestJSON-897685954 tempest-InstanceActionsV221TestJSON-897685954-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 958.168385] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-5b691621-fdba-4b2e-ba8f-c3c8a4de96e3 tempest-InstanceActionsV221TestJSON-897685954 
tempest-InstanceActionsV221TestJSON-897685954-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 958.171123] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-5b691621-fdba-4b2e-ba8f-c3c8a4de96e3 tempest-InstanceActionsV221TestJSON-897685954 tempest-InstanceActionsV221TestJSON-897685954-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 958.171123] nova-conductor[52626]: Traceback (most recent call last): [ 958.171123] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 958.171123] nova-conductor[52626]: return func(*args, **kwargs) [ 958.171123] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 958.171123] nova-conductor[52626]: selections = self._select_destinations( [ 958.171123] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 958.171123] nova-conductor[52626]: selections = self._schedule( [ 958.171123] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 958.171123] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 958.171123] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 958.171123] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 958.171123] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 958.171123] nova-conductor[52626]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 958.171923] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-5b691621-fdba-4b2e-ba8f-c3c8a4de96e3 tempest-InstanceActionsV221TestJSON-897685954 tempest-InstanceActionsV221TestJSON-897685954-project-member] [instance: baa28177-bbec-463f-87ff-2673e223b0d1] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 963.184012] nova-conductor[52625]: ERROR nova.conductor.manager [None req-d3f35ba8-1675-499b-a6e6-caf750923993 tempest-ServerActionsTestOtherA-1381941062 tempest-ServerActionsTestOtherA-1381941062-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 963.184012] nova-conductor[52625]: Traceback (most recent call last): [ 963.184012] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 963.184012] nova-conductor[52625]: return func(*args, **kwargs) [ 963.184012] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 963.184012] nova-conductor[52625]: selections = self._select_destinations( [ 963.184012] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 963.184012] nova-conductor[52625]: selections = self._schedule( [ 963.184012] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 963.184012] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 963.184012] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 963.184012] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 963.184012] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 963.184012] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 963.184012] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 963.184012] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 963.184012] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 963.184012] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 963.184012] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 963.184012] nova-conductor[52625]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 963.184012] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 963.184012] nova-conductor[52625]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 963.184012] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 963.184012] nova-conductor[52625]: ERROR nova.conductor.manager result = self.transport._send( [ 963.184012] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 963.184012] nova-conductor[52625]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 963.184012] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 963.184012] nova-conductor[52625]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 963.184012] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 963.184012] nova-conductor[52625]: ERROR nova.conductor.manager raise result [ 963.184012] nova-conductor[52625]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 963.184012] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 963.184012] nova-conductor[52625]: ERROR nova.conductor.manager [ 963.184012] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 963.184012] nova-conductor[52625]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 963.184012] nova-conductor[52625]: ERROR nova.conductor.manager [ 963.184012] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 963.184012] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._select_destinations( [ 963.184012] nova-conductor[52625]: ERROR nova.conductor.manager [ 963.184012] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 963.184012] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._schedule( [ 963.184012] nova-conductor[52625]: ERROR nova.conductor.manager [ 963.184012] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 963.184012] nova-conductor[52625]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 963.184012] nova-conductor[52625]: ERROR nova.conductor.manager [ 963.184012] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 963.184012] nova-conductor[52625]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 963.184012] nova-conductor[52625]: ERROR nova.conductor.manager [ 963.184012] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 963.184012] nova-conductor[52625]: ERROR nova.conductor.manager [ 963.184012] nova-conductor[52625]: ERROR nova.conductor.manager [ 963.192183] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-d3f35ba8-1675-499b-a6e6-caf750923993 tempest-ServerActionsTestOtherA-1381941062 tempest-ServerActionsTestOtherA-1381941062-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 963.192415] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-d3f35ba8-1675-499b-a6e6-caf750923993 tempest-ServerActionsTestOtherA-1381941062 tempest-ServerActionsTestOtherA-1381941062-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 963.192589] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-d3f35ba8-1675-499b-a6e6-caf750923993 tempest-ServerActionsTestOtherA-1381941062 tempest-ServerActionsTestOtherA-1381941062-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 963.237828] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-d3f35ba8-1675-499b-a6e6-caf750923993 tempest-ServerActionsTestOtherA-1381941062 tempest-ServerActionsTestOtherA-1381941062-project-member] [instance: 2b6f5a87-2f2f-4696-841b-4a023916374d] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 963.238556] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-d3f35ba8-1675-499b-a6e6-caf750923993 tempest-ServerActionsTestOtherA-1381941062 tempest-ServerActionsTestOtherA-1381941062-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 963.238766] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-d3f35ba8-1675-499b-a6e6-caf750923993 tempest-ServerActionsTestOtherA-1381941062 tempest-ServerActionsTestOtherA-1381941062-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 963.238939] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-d3f35ba8-1675-499b-a6e6-caf750923993 tempest-ServerActionsTestOtherA-1381941062 tempest-ServerActionsTestOtherA-1381941062-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 963.241905] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-d3f35ba8-1675-499b-a6e6-caf750923993 tempest-ServerActionsTestOtherA-1381941062 tempest-ServerActionsTestOtherA-1381941062-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 963.241905] nova-conductor[52625]: Traceback (most recent call last): [ 963.241905] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 963.241905] nova-conductor[52625]: return func(*args, **kwargs) [ 963.241905] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 963.241905] nova-conductor[52625]: selections = self._select_destinations( [ 963.241905] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 963.241905] nova-conductor[52625]: selections = self._schedule( [ 963.241905] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 963.241905] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 963.241905] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 963.241905] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 963.241905] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 963.241905] nova-conductor[52625]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 963.242471] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-d3f35ba8-1675-499b-a6e6-caf750923993 tempest-ServerActionsTestOtherA-1381941062 tempest-ServerActionsTestOtherA-1381941062-project-member] [instance: 2b6f5a87-2f2f-4696-841b-4a023916374d] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 965.636597] nova-conductor[52626]: ERROR nova.conductor.manager [None req-b452cf0a-ee09-4441-be8c-1debac72c4a5 tempest-ServerAddressesTestJSON-67384702 tempest-ServerAddressesTestJSON-67384702-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 965.636597] nova-conductor[52626]: Traceback (most recent call last): [ 965.636597] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 965.636597] nova-conductor[52626]: return func(*args, **kwargs) [ 965.636597] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 965.636597] nova-conductor[52626]: selections = self._select_destinations( [ 965.636597] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 965.636597] nova-conductor[52626]: selections = self._schedule( [ 965.636597] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 965.636597] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 965.636597] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 965.636597] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 965.636597] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 965.636597] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 965.636597] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 965.636597] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 965.636597] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 965.636597] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 965.636597] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 965.636597] nova-conductor[52626]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 965.636597] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 965.636597] nova-conductor[52626]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 965.636597] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 965.636597] nova-conductor[52626]: ERROR nova.conductor.manager result = self.transport._send( [ 965.636597] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 965.636597] nova-conductor[52626]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 965.636597] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 965.636597] nova-conductor[52626]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 965.636597] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 965.636597] nova-conductor[52626]: ERROR nova.conductor.manager raise result [ 965.636597] nova-conductor[52626]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 965.636597] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 965.636597] nova-conductor[52626]: ERROR nova.conductor.manager [ 965.636597] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 965.636597] nova-conductor[52626]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 965.636597] nova-conductor[52626]: ERROR nova.conductor.manager [ 965.636597] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 965.636597] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._select_destinations( [ 965.636597] nova-conductor[52626]: ERROR nova.conductor.manager [ 965.636597] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 965.636597] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._schedule( [ 965.636597] nova-conductor[52626]: ERROR nova.conductor.manager [ 965.636597] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 965.636597] nova-conductor[52626]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 965.636597] nova-conductor[52626]: ERROR nova.conductor.manager [ 965.636597] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 965.636597] nova-conductor[52626]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 965.636597] nova-conductor[52626]: ERROR nova.conductor.manager [ 965.636597] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 965.636597] nova-conductor[52626]: ERROR nova.conductor.manager [ 965.636597] nova-conductor[52626]: ERROR nova.conductor.manager [ 965.644543] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-b452cf0a-ee09-4441-be8c-1debac72c4a5 tempest-ServerAddressesTestJSON-67384702 tempest-ServerAddressesTestJSON-67384702-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 965.644952] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-b452cf0a-ee09-4441-be8c-1debac72c4a5 tempest-ServerAddressesTestJSON-67384702 tempest-ServerAddressesTestJSON-67384702-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 965.644952] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-b452cf0a-ee09-4441-be8c-1debac72c4a5 tempest-ServerAddressesTestJSON-67384702 tempest-ServerAddressesTestJSON-67384702-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 965.693277] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-b452cf0a-ee09-4441-be8c-1debac72c4a5 tempest-ServerAddressesTestJSON-67384702 tempest-ServerAddressesTestJSON-67384702-project-member] [instance: bef40672-e4d7-4db7-bb3a-79a17789ba02] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 965.696093] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-b452cf0a-ee09-4441-be8c-1debac72c4a5 tempest-ServerAddressesTestJSON-67384702 tempest-ServerAddressesTestJSON-67384702-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 965.696093] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-b452cf0a-ee09-4441-be8c-1debac72c4a5 tempest-ServerAddressesTestJSON-67384702 tempest-ServerAddressesTestJSON-67384702-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 965.696093] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-b452cf0a-ee09-4441-be8c-1debac72c4a5 tempest-ServerAddressesTestJSON-67384702 tempest-ServerAddressesTestJSON-67384702-project-member] Lock "00000000-0000-0000-0000-000000000000" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 965.698440] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-b452cf0a-ee09-4441-be8c-1debac72c4a5 tempest-ServerAddressesTestJSON-67384702 tempest-ServerAddressesTestJSON-67384702-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 965.698440] nova-conductor[52626]: Traceback (most recent call last): [ 965.698440] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 965.698440] nova-conductor[52626]: return func(*args, **kwargs) [ 965.698440] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 965.698440] nova-conductor[52626]: selections = self._select_destinations( [ 965.698440] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 965.698440] nova-conductor[52626]: selections = self._schedule( [ 965.698440] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 965.698440] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 965.698440] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 965.698440] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 965.698440] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 965.698440] nova-conductor[52626]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 965.699096] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-b452cf0a-ee09-4441-be8c-1debac72c4a5 tempest-ServerAddressesTestJSON-67384702 tempest-ServerAddressesTestJSON-67384702-project-member] [instance: bef40672-e4d7-4db7-bb3a-79a17789ba02] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 971.257780] nova-conductor[52625]: ERROR nova.conductor.manager [None req-b957d6e7-85ec-49ce-9493-4b6c114f7c75 tempest-ImagesTestJSON-864704258 tempest-ImagesTestJSON-864704258-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 971.257780] nova-conductor[52625]: Traceback (most recent call last): [ 971.257780] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 971.257780] nova-conductor[52625]: return func(*args, **kwargs) [ 971.257780] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 971.257780] nova-conductor[52625]: selections = self._select_destinations( [ 971.257780] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 971.257780] nova-conductor[52625]: selections = self._schedule( [ 971.257780] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 971.257780] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 971.257780] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 971.257780] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 971.257780] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 971.257780] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 971.257780] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 971.257780] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 971.257780] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 971.257780] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 971.257780] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 971.257780] nova-conductor[52625]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 971.257780] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 971.257780] nova-conductor[52625]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 971.257780] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 971.257780] nova-conductor[52625]: ERROR nova.conductor.manager result = self.transport._send( [ 971.257780] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 971.257780] nova-conductor[52625]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 971.257780] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 971.257780] nova-conductor[52625]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 971.257780] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 971.257780] nova-conductor[52625]: ERROR nova.conductor.manager raise result [ 971.257780] nova-conductor[52625]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 971.257780] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 971.257780] nova-conductor[52625]: ERROR nova.conductor.manager [ 971.257780] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 971.257780] nova-conductor[52625]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 971.257780] nova-conductor[52625]: ERROR nova.conductor.manager [ 971.257780] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 971.257780] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._select_destinations( [ 971.257780] nova-conductor[52625]: ERROR nova.conductor.manager [ 971.257780] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 971.257780] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._schedule( [ 971.257780] nova-conductor[52625]: ERROR nova.conductor.manager [ 971.257780] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 971.257780] nova-conductor[52625]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 971.257780] nova-conductor[52625]: ERROR nova.conductor.manager [ 971.257780] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 971.257780] nova-conductor[52625]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 971.257780] nova-conductor[52625]: ERROR nova.conductor.manager [ 971.257780] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 971.257780] nova-conductor[52625]: ERROR nova.conductor.manager [ 971.257780] nova-conductor[52625]: ERROR nova.conductor.manager [ 971.270152] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-b957d6e7-85ec-49ce-9493-4b6c114f7c75 tempest-ImagesTestJSON-864704258 tempest-ImagesTestJSON-864704258-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 971.270415] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-b957d6e7-85ec-49ce-9493-4b6c114f7c75 tempest-ImagesTestJSON-864704258 tempest-ImagesTestJSON-864704258-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 971.270594] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-b957d6e7-85ec-49ce-9493-4b6c114f7c75 tempest-ImagesTestJSON-864704258 tempest-ImagesTestJSON-864704258-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 971.324418] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-b957d6e7-85ec-49ce-9493-4b6c114f7c75 tempest-ImagesTestJSON-864704258 tempest-ImagesTestJSON-864704258-project-member] [instance: fac6315e-9a60-4b9a-9441-2a68570f7418] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 971.325387] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-b957d6e7-85ec-49ce-9493-4b6c114f7c75 tempest-ImagesTestJSON-864704258 tempest-ImagesTestJSON-864704258-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 971.325728] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-b957d6e7-85ec-49ce-9493-4b6c114f7c75 tempest-ImagesTestJSON-864704258 tempest-ImagesTestJSON-864704258-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 971.326014] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-b957d6e7-85ec-49ce-9493-4b6c114f7c75 tempest-ImagesTestJSON-864704258 tempest-ImagesTestJSON-864704258-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) 
inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 971.329981] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-b957d6e7-85ec-49ce-9493-4b6c114f7c75 tempest-ImagesTestJSON-864704258 tempest-ImagesTestJSON-864704258-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 971.329981] nova-conductor[52625]: Traceback (most recent call last): [ 971.329981] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 971.329981] nova-conductor[52625]: return func(*args, **kwargs) [ 971.329981] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 971.329981] nova-conductor[52625]: selections = self._select_destinations( [ 971.329981] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 971.329981] nova-conductor[52625]: selections = self._schedule( [ 971.329981] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 971.329981] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 971.329981] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 971.329981] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 971.329981] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 971.329981] nova-conductor[52625]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 971.329981] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-b957d6e7-85ec-49ce-9493-4b6c114f7c75 tempest-ImagesTestJSON-864704258 tempest-ImagesTestJSON-864704258-project-member] [instance: fac6315e-9a60-4b9a-9441-2a68570f7418] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 972.368618] nova-conductor[52626]: ERROR nova.conductor.manager [None req-1f12d987-55e8-4061-b910-44a7537cab93 tempest-ServerShowV257Test-1481440164 tempest-ServerShowV257Test-1481440164-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 972.368618] nova-conductor[52626]: Traceback (most recent call last): [ 972.368618] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 972.368618] nova-conductor[52626]: return func(*args, **kwargs) [ 972.368618] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 972.368618] nova-conductor[52626]: selections = self._select_destinations( [ 972.368618] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 972.368618] nova-conductor[52626]: selections = self._schedule( [ 972.368618] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 972.368618] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 972.368618] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 972.368618] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 972.368618] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. 
There are not enough hosts available. [ 972.368618] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 972.368618] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 972.368618] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 972.368618] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 972.368618] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 972.368618] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 972.368618] nova-conductor[52626]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 972.368618] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 972.368618] nova-conductor[52626]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 972.368618] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 972.368618] nova-conductor[52626]: ERROR nova.conductor.manager result = self.transport._send( [ 972.368618] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 972.368618] nova-conductor[52626]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 972.368618] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 972.368618] nova-conductor[52626]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 972.368618] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 972.368618] nova-conductor[52626]: ERROR nova.conductor.manager raise result [ 972.368618] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 972.368618] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 972.368618] nova-conductor[52626]: ERROR nova.conductor.manager [ 972.368618] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 972.368618] nova-conductor[52626]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 972.368618] nova-conductor[52626]: ERROR nova.conductor.manager [ 972.368618] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 972.368618] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._select_destinations( [ 972.368618] nova-conductor[52626]: ERROR nova.conductor.manager [ 972.368618] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 972.368618] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._schedule( [ 972.368618] nova-conductor[52626]: ERROR nova.conductor.manager [ 972.368618] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 972.368618] nova-conductor[52626]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 972.368618] nova-conductor[52626]: ERROR nova.conductor.manager [ 972.368618] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 972.368618] nova-conductor[52626]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 972.368618] nova-conductor[52626]: ERROR nova.conductor.manager [ 972.368618] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 972.368618] nova-conductor[52626]: ERROR nova.conductor.manager [ 972.368618] nova-conductor[52626]: ERROR nova.conductor.manager [ 972.381187] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-1f12d987-55e8-4061-b910-44a7537cab93 tempest-ServerShowV257Test-1481440164 tempest-ServerShowV257Test-1481440164-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 972.381187] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-1f12d987-55e8-4061-b910-44a7537cab93 tempest-ServerShowV257Test-1481440164 tempest-ServerShowV257Test-1481440164-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 972.381187] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-1f12d987-55e8-4061-b910-44a7537cab93 tempest-ServerShowV257Test-1481440164 tempest-ServerShowV257Test-1481440164-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 972.432588] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-1f12d987-55e8-4061-b910-44a7537cab93 tempest-ServerShowV257Test-1481440164 tempest-ServerShowV257Test-1481440164-project-member] [instance: 232d6615-9e1e-47a7-8ec5-815be90b0e89] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 972.433440] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-1f12d987-55e8-4061-b910-44a7537cab93 tempest-ServerShowV257Test-1481440164 tempest-ServerShowV257Test-1481440164-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 972.433660] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-1f12d987-55e8-4061-b910-44a7537cab93 tempest-ServerShowV257Test-1481440164 tempest-ServerShowV257Test-1481440164-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 972.433833] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-1f12d987-55e8-4061-b910-44a7537cab93 tempest-ServerShowV257Test-1481440164 tempest-ServerShowV257Test-1481440164-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 972.437540] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-1f12d987-55e8-4061-b910-44a7537cab93 tempest-ServerShowV257Test-1481440164 tempest-ServerShowV257Test-1481440164-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 972.437540] nova-conductor[52626]: Traceback (most recent call last): [ 972.437540] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 972.437540] nova-conductor[52626]: return func(*args, **kwargs) [ 972.437540] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 972.437540] nova-conductor[52626]: selections = self._select_destinations( [ 972.437540] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 972.437540] nova-conductor[52626]: selections = self._schedule( [ 972.437540] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 972.437540] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 972.437540] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 972.437540] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 972.437540] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 972.437540] nova-conductor[52626]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 972.437982] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-1f12d987-55e8-4061-b910-44a7537cab93 tempest-ServerShowV257Test-1481440164 tempest-ServerShowV257Test-1481440164-project-member] [instance: 232d6615-9e1e-47a7-8ec5-815be90b0e89] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 973.114893] nova-conductor[52625]: ERROR nova.conductor.manager [None req-89d6da54-08e4-4ec7-8efa-38d4e0bd4823 tempest-ImagesTestJSON-864704258 tempest-ImagesTestJSON-864704258-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 973.114893] nova-conductor[52625]: Traceback (most recent call last): [ 973.114893] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 973.114893] nova-conductor[52625]: return func(*args, **kwargs) [ 973.114893] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 973.114893] nova-conductor[52625]: selections = self._select_destinations( [ 973.114893] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 973.114893] nova-conductor[52625]: selections = self._schedule( [ 973.114893] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 973.114893] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 973.114893] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 973.114893] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 973.114893] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 973.114893] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 973.114893] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 973.114893] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 973.114893] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 973.114893] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 973.114893] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 973.114893] nova-conductor[52625]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 973.114893] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 973.114893] nova-conductor[52625]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 973.114893] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 973.114893] nova-conductor[52625]: ERROR nova.conductor.manager result = self.transport._send( [ 973.114893] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 973.114893] nova-conductor[52625]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 973.114893] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 973.114893] nova-conductor[52625]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 973.114893] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 973.114893] nova-conductor[52625]: ERROR nova.conductor.manager raise result [ 973.114893] nova-conductor[52625]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 973.114893] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 973.114893] nova-conductor[52625]: ERROR nova.conductor.manager [ 973.114893] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 973.114893] nova-conductor[52625]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 973.114893] nova-conductor[52625]: ERROR nova.conductor.manager [ 973.114893] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 973.114893] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._select_destinations( [ 973.114893] nova-conductor[52625]: ERROR nova.conductor.manager [ 973.114893] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 973.114893] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._schedule( [ 973.114893] nova-conductor[52625]: ERROR nova.conductor.manager [ 973.114893] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 973.114893] nova-conductor[52625]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 973.114893] nova-conductor[52625]: ERROR nova.conductor.manager [ 973.114893] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 973.114893] nova-conductor[52625]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 973.114893] nova-conductor[52625]: ERROR nova.conductor.manager [ 973.114893] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 973.114893] nova-conductor[52625]: ERROR nova.conductor.manager [ 973.114893] nova-conductor[52625]: ERROR nova.conductor.manager [ 973.121855] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-89d6da54-08e4-4ec7-8efa-38d4e0bd4823 tempest-ImagesTestJSON-864704258 tempest-ImagesTestJSON-864704258-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 973.122209] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-89d6da54-08e4-4ec7-8efa-38d4e0bd4823 tempest-ImagesTestJSON-864704258 tempest-ImagesTestJSON-864704258-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 973.122336] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-89d6da54-08e4-4ec7-8efa-38d4e0bd4823 tempest-ImagesTestJSON-864704258 tempest-ImagesTestJSON-864704258-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 973.163915] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-89d6da54-08e4-4ec7-8efa-38d4e0bd4823 tempest-ImagesTestJSON-864704258 tempest-ImagesTestJSON-864704258-project-member] [instance: 7ea3668a-a4d0-4feb-93f8-0284b44b4460] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 973.164625] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-89d6da54-08e4-4ec7-8efa-38d4e0bd4823 tempest-ImagesTestJSON-864704258 tempest-ImagesTestJSON-864704258-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 973.164910] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-89d6da54-08e4-4ec7-8efa-38d4e0bd4823 tempest-ImagesTestJSON-864704258 tempest-ImagesTestJSON-864704258-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 973.165016] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-89d6da54-08e4-4ec7-8efa-38d4e0bd4823 tempest-ImagesTestJSON-864704258 tempest-ImagesTestJSON-864704258-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) 
inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 973.173021] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-89d6da54-08e4-4ec7-8efa-38d4e0bd4823 tempest-ImagesTestJSON-864704258 tempest-ImagesTestJSON-864704258-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 973.173021] nova-conductor[52625]: Traceback (most recent call last): [ 973.173021] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 973.173021] nova-conductor[52625]: return func(*args, **kwargs) [ 973.173021] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 973.173021] nova-conductor[52625]: selections = self._select_destinations( [ 973.173021] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 973.173021] nova-conductor[52625]: selections = self._schedule( [ 973.173021] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 973.173021] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 973.173021] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 973.173021] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 973.173021] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 973.173021] nova-conductor[52625]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 973.173021] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-89d6da54-08e4-4ec7-8efa-38d4e0bd4823 tempest-ImagesTestJSON-864704258 tempest-ImagesTestJSON-864704258-project-member] [instance: 7ea3668a-a4d0-4feb-93f8-0284b44b4460] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 973.202092] nova-conductor[52626]: ERROR nova.conductor.manager [None req-c48a4944-1992-4f2c-a760-65a179a0f2ed tempest-ServerPasswordTestJSON-944192765 tempest-ServerPasswordTestJSON-944192765-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 973.202092] nova-conductor[52626]: Traceback (most recent call last): [ 973.202092] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 973.202092] nova-conductor[52626]: return func(*args, **kwargs) [ 973.202092] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 973.202092] nova-conductor[52626]: selections = self._select_destinations( [ 973.202092] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 973.202092] nova-conductor[52626]: selections = self._schedule( [ 973.202092] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 973.202092] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 973.202092] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 973.202092] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 973.202092] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. 
There are not enough hosts available. [ 973.202092] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 973.202092] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 973.202092] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 973.202092] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 973.202092] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 973.202092] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 973.202092] nova-conductor[52626]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 973.202092] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 973.202092] nova-conductor[52626]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 973.202092] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 973.202092] nova-conductor[52626]: ERROR nova.conductor.manager result = self.transport._send( [ 973.202092] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 973.202092] nova-conductor[52626]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 973.202092] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 973.202092] nova-conductor[52626]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 973.202092] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 973.202092] nova-conductor[52626]: ERROR nova.conductor.manager raise result [ 973.202092] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 973.202092] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 973.202092] nova-conductor[52626]: ERROR nova.conductor.manager [ 973.202092] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 973.202092] nova-conductor[52626]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 973.202092] nova-conductor[52626]: ERROR nova.conductor.manager [ 973.202092] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 973.202092] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._select_destinations( [ 973.202092] nova-conductor[52626]: ERROR nova.conductor.manager [ 973.202092] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 973.202092] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._schedule( [ 973.202092] nova-conductor[52626]: ERROR nova.conductor.manager [ 973.202092] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 973.202092] nova-conductor[52626]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 973.202092] nova-conductor[52626]: ERROR nova.conductor.manager [ 973.202092] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 973.202092] nova-conductor[52626]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 973.202092] nova-conductor[52626]: ERROR nova.conductor.manager [ 973.202092] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
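Every one of these failures bottoms out in the same place: nova-conductor asks the scheduler for destinations over RPC (select_destinations), the scheduler's _schedule ends up with fewer candidate hosts than the request needs, and _ensure_sufficient_hosts raises NoValidHost with the reason "There are not enough hosts available." The sketch below is not the Nova source; only the names _ensure_sufficient_hosts, NoValidHost, and the reason string come from the traceback, and the body is an assumption kept as small as possible for illustration.

```python
# Minimal sketch of the sufficiency check implied by the traceback above.
# NOT the actual nova/scheduler/manager.py code; names and the reason
# string are taken from the log, the body is assumed for illustration.

class NoValidHost(Exception):
    """Stand-in for nova.exception.NoValidHost."""
    def __init__(self, reason):
        super().__init__("No valid host was found. %s" % reason)


def _ensure_sufficient_hosts(hosts, num_instances):
    """Fail the whole build request if fewer hosts survived filtering
    than the number of instances that still need a destination."""
    if len(hosts) >= num_instances:
        return
    raise NoValidHost(reason="There are not enough hosts available.")


# Example: one request for two instances, but only one host passed the filters.
try:
    _ensure_sufficient_hosts(hosts=["compute-1"], num_instances=2)
except NoValidHost as exc:
    print(exc)  # No valid host was found. There are not enough hosts available.
```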
[ 973.202092] nova-conductor[52626]: ERROR nova.conductor.manager [ 973.202092] nova-conductor[52626]: ERROR nova.conductor.manager [ 973.208096] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-c48a4944-1992-4f2c-a760-65a179a0f2ed tempest-ServerPasswordTestJSON-944192765 tempest-ServerPasswordTestJSON-944192765-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 973.208372] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-c48a4944-1992-4f2c-a760-65a179a0f2ed tempest-ServerPasswordTestJSON-944192765 tempest-ServerPasswordTestJSON-944192765-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 973.208562] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-c48a4944-1992-4f2c-a760-65a179a0f2ed tempest-ServerPasswordTestJSON-944192765 tempest-ServerPasswordTestJSON-944192765-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 973.244866] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-c48a4944-1992-4f2c-a760-65a179a0f2ed tempest-ServerPasswordTestJSON-944192765 tempest-ServerPasswordTestJSON-944192765-project-member] [instance: faf8511a-1790-49f5-ab99-53b04afaafe1] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 973.245646] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-c48a4944-1992-4f2c-a760-65a179a0f2ed tempest-ServerPasswordTestJSON-944192765 tempest-ServerPasswordTestJSON-944192765-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 973.245863] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-c48a4944-1992-4f2c-a760-65a179a0f2ed tempest-ServerPasswordTestJSON-944192765 tempest-ServerPasswordTestJSON-944192765-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 973.246047] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-c48a4944-1992-4f2c-a760-65a179a0f2ed tempest-ServerPasswordTestJSON-944192765 tempest-ServerPasswordTestJSON-944192765-project-member] Lock "00000000-0000-0000-0000-000000000000" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 973.248847] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-c48a4944-1992-4f2c-a760-65a179a0f2ed tempest-ServerPasswordTestJSON-944192765 tempest-ServerPasswordTestJSON-944192765-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 973.248847] nova-conductor[52626]: Traceback (most recent call last): [ 973.248847] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 973.248847] nova-conductor[52626]: return func(*args, **kwargs) [ 973.248847] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 973.248847] nova-conductor[52626]: selections = self._select_destinations( [ 973.248847] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 973.248847] nova-conductor[52626]: selections = self._schedule( [ 973.248847] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 973.248847] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 973.248847] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 973.248847] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 973.248847] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 973.248847] nova-conductor[52626]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 973.249379] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-c48a4944-1992-4f2c-a760-65a179a0f2ed tempest-ServerPasswordTestJSON-944192765 tempest-ServerPasswordTestJSON-944192765-project-member] [instance: faf8511a-1790-49f5-ab99-53b04afaafe1] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 974.997718] nova-conductor[52625]: ERROR nova.conductor.manager [None req-68fc71e1-61e3-43ec-82b6-584db5f17186 tempest-ImagesTestJSON-864704258 tempest-ImagesTestJSON-864704258-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 974.997718] nova-conductor[52625]: Traceback (most recent call last): [ 974.997718] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 974.997718] nova-conductor[52625]: return func(*args, **kwargs) [ 974.997718] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 974.997718] nova-conductor[52625]: selections = self._select_destinations( [ 974.997718] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 974.997718] nova-conductor[52625]: selections = self._schedule( [ 974.997718] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 974.997718] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 974.997718] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 974.997718] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 974.997718] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 974.997718] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 974.997718] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 974.997718] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 974.997718] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 974.997718] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 974.997718] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 974.997718] nova-conductor[52625]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 974.997718] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 974.997718] nova-conductor[52625]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 974.997718] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 974.997718] nova-conductor[52625]: ERROR nova.conductor.manager result = self.transport._send( [ 974.997718] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 974.997718] nova-conductor[52625]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 974.997718] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 974.997718] nova-conductor[52625]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 974.997718] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 974.997718] nova-conductor[52625]: ERROR nova.conductor.manager raise result [ 974.997718] nova-conductor[52625]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 974.997718] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 974.997718] nova-conductor[52625]: ERROR nova.conductor.manager [ 974.997718] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 974.997718] nova-conductor[52625]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 974.997718] nova-conductor[52625]: ERROR nova.conductor.manager [ 974.997718] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 974.997718] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._select_destinations( [ 974.997718] nova-conductor[52625]: ERROR nova.conductor.manager [ 974.997718] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 974.997718] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._schedule( [ 974.997718] nova-conductor[52625]: ERROR nova.conductor.manager [ 974.997718] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 974.997718] nova-conductor[52625]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 974.997718] nova-conductor[52625]: ERROR nova.conductor.manager [ 974.997718] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 974.997718] nova-conductor[52625]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 974.997718] nova-conductor[52625]: ERROR nova.conductor.manager [ 974.997718] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
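Note the exception class as it appears on the conductor side: nova.exception_Remote.NoValidHost_Remote rather than plain nova.exception.NoValidHost. The scheduler raises the exception in its own process; oslo.messaging serializes it with the RPC reply and rebuilds it for the caller as a dynamically generated subclass whose class and module names carry a "_Remote" suffix. The snippet below only demonstrates that naming trick in isolation; it is a simplified stand-in, not oslo.messaging's actual deserialization code.

```python
# Simplified illustration of how a "_Remote" exception type can be
# synthesized on the RPC client side. The real logic lives in
# oslo.messaging (deserialize_remote_exception); this is an assumption-level
# stand-in that only reproduces the naming pattern seen in the log.

class NoValidHost(Exception):
    pass


def make_remote_type(exc_cls):
    """Build a subclass named <Name>_Remote in module <module>_Remote,
    mirroring the nova.exception_Remote.NoValidHost_Remote shown above."""
    remote_cls = type(exc_cls.__name__ + "_Remote", (exc_cls,), {})
    remote_cls.__module__ = exc_cls.__module__ + "_Remote"
    return remote_cls


NoValidHost_Remote = make_remote_type(NoValidHost)
err = NoValidHost_Remote("No valid host was found. There are not enough hosts available.")
print(type(err).__module__, type(err).__name__)  # e.g. __main___Remote NoValidHost_Remote
```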
[ 974.997718] nova-conductor[52625]: ERROR nova.conductor.manager [ 974.997718] nova-conductor[52625]: ERROR nova.conductor.manager [ 975.005505] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-68fc71e1-61e3-43ec-82b6-584db5f17186 tempest-ImagesTestJSON-864704258 tempest-ImagesTestJSON-864704258-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 975.005884] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-68fc71e1-61e3-43ec-82b6-584db5f17186 tempest-ImagesTestJSON-864704258 tempest-ImagesTestJSON-864704258-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 975.006145] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-68fc71e1-61e3-43ec-82b6-584db5f17186 tempest-ImagesTestJSON-864704258 tempest-ImagesTestJSON-864704258-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 975.043360] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-68fc71e1-61e3-43ec-82b6-584db5f17186 tempest-ImagesTestJSON-864704258 tempest-ImagesTestJSON-864704258-project-member] [instance: 0df0d707-0432-449d-adb2-4a4f693b900a] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 975.044104] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-68fc71e1-61e3-43ec-82b6-584db5f17186 tempest-ImagesTestJSON-864704258 tempest-ImagesTestJSON-864704258-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 975.044586] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-68fc71e1-61e3-43ec-82b6-584db5f17186 tempest-ImagesTestJSON-864704258 tempest-ImagesTestJSON-864704258-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 975.044802] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-68fc71e1-61e3-43ec-82b6-584db5f17186 tempest-ImagesTestJSON-864704258 tempest-ImagesTestJSON-864704258-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) 
inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 975.047860] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-68fc71e1-61e3-43ec-82b6-584db5f17186 tempest-ImagesTestJSON-864704258 tempest-ImagesTestJSON-864704258-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 975.047860] nova-conductor[52625]: Traceback (most recent call last): [ 975.047860] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 975.047860] nova-conductor[52625]: return func(*args, **kwargs) [ 975.047860] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 975.047860] nova-conductor[52625]: selections = self._select_destinations( [ 975.047860] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 975.047860] nova-conductor[52625]: selections = self._schedule( [ 975.047860] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 975.047860] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 975.047860] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 975.047860] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 975.047860] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 975.047860] nova-conductor[52625]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 975.048561] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-68fc71e1-61e3-43ec-82b6-584db5f17186 tempest-ImagesTestJSON-864704258 tempest-ImagesTestJSON-864704258-project-member] [instance: 0df0d707-0432-449d-adb2-4a4f693b900a] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 976.969849] nova-conductor[52626]: ERROR nova.conductor.manager [None req-0d20696c-d452-497e-9d1d-4e865894d211 tempest-ImagesTestJSON-864704258 tempest-ImagesTestJSON-864704258-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 976.969849] nova-conductor[52626]: Traceback (most recent call last): [ 976.969849] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 976.969849] nova-conductor[52626]: return func(*args, **kwargs) [ 976.969849] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 976.969849] nova-conductor[52626]: selections = self._select_destinations( [ 976.969849] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 976.969849] nova-conductor[52626]: selections = self._schedule( [ 976.969849] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 976.969849] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 976.969849] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 976.969849] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 976.969849] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. 
There are not enough hosts available. [ 976.969849] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 976.969849] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 976.969849] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 976.969849] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 976.969849] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 976.969849] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 976.969849] nova-conductor[52626]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 976.969849] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 976.969849] nova-conductor[52626]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 976.969849] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 976.969849] nova-conductor[52626]: ERROR nova.conductor.manager result = self.transport._send( [ 976.969849] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 976.969849] nova-conductor[52626]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 976.969849] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 976.969849] nova-conductor[52626]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 976.969849] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 976.969849] nova-conductor[52626]: ERROR nova.conductor.manager raise result [ 976.969849] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 976.969849] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 976.969849] nova-conductor[52626]: ERROR nova.conductor.manager [ 976.969849] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 976.969849] nova-conductor[52626]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 976.969849] nova-conductor[52626]: ERROR nova.conductor.manager [ 976.969849] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 976.969849] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._select_destinations( [ 976.969849] nova-conductor[52626]: ERROR nova.conductor.manager [ 976.969849] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 976.969849] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._schedule( [ 976.969849] nova-conductor[52626]: ERROR nova.conductor.manager [ 976.969849] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 976.969849] nova-conductor[52626]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 976.969849] nova-conductor[52626]: ERROR nova.conductor.manager [ 976.969849] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 976.969849] nova-conductor[52626]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 976.969849] nova-conductor[52626]: ERROR nova.conductor.manager [ 976.969849] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
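Interspersed with the scheduling failures, the conductor also logs the BlockDeviceMapping it assembled for each request (the block_device_mapping DEBUG entries above): an image-backed root disk with source_type='image', destination_type='local', boot_index=0, delete_on_termination=True, and no volume. Below is a hedged reconstruction of the kind of block_device_mapping_v2 entry on the API side that typically yields such a BDM; the image UUID is copied from the log, the field values mirror what is shown, and the exact request shape is an assumption rather than a capture of the Tempest request.

```python
# Hypothetical server-create fragment that would produce the BlockDeviceMapping
# seen in the DEBUG lines above (image-backed ephemeral root disk).
# The image UUID is copied from the log; everything else is illustrative.

block_device_mapping_v2 = [
    {
        "boot_index": 0,                                  # root disk
        "uuid": "ab7fcb5a-745a-4c08-9c04-49b187178f83",   # image_id from the log
        "source_type": "image",                           # source_type='image'
        "destination_type": "local",                      # destination_type='local'
        "delete_on_termination": True,                    # delete_on_termination=True
    }
]

print(block_device_mapping_v2)
```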
[ 976.969849] nova-conductor[52626]: ERROR nova.conductor.manager [ 976.969849] nova-conductor[52626]: ERROR nova.conductor.manager [ 976.976731] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-0d20696c-d452-497e-9d1d-4e865894d211 tempest-ImagesTestJSON-864704258 tempest-ImagesTestJSON-864704258-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 976.977299] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-0d20696c-d452-497e-9d1d-4e865894d211 tempest-ImagesTestJSON-864704258 tempest-ImagesTestJSON-864704258-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 976.977686] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-0d20696c-d452-497e-9d1d-4e865894d211 tempest-ImagesTestJSON-864704258 tempest-ImagesTestJSON-864704258-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 977.021849] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-0d20696c-d452-497e-9d1d-4e865894d211 tempest-ImagesTestJSON-864704258 tempest-ImagesTestJSON-864704258-project-member] [instance: d23dea5a-ba35-4483-9fbe-df3ff2a32a24] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 977.022719] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-0d20696c-d452-497e-9d1d-4e865894d211 tempest-ImagesTestJSON-864704258 tempest-ImagesTestJSON-864704258-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 977.022835] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-0d20696c-d452-497e-9d1d-4e865894d211 tempest-ImagesTestJSON-864704258 tempest-ImagesTestJSON-864704258-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 977.023011] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-0d20696c-d452-497e-9d1d-4e865894d211 tempest-ImagesTestJSON-864704258 tempest-ImagesTestJSON-864704258-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) 
inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 977.025996] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-0d20696c-d452-497e-9d1d-4e865894d211 tempest-ImagesTestJSON-864704258 tempest-ImagesTestJSON-864704258-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 977.025996] nova-conductor[52626]: Traceback (most recent call last): [ 977.025996] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 977.025996] nova-conductor[52626]: return func(*args, **kwargs) [ 977.025996] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 977.025996] nova-conductor[52626]: selections = self._select_destinations( [ 977.025996] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 977.025996] nova-conductor[52626]: selections = self._schedule( [ 977.025996] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 977.025996] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 977.025996] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 977.025996] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 977.025996] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 977.025996] nova-conductor[52626]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 977.026533] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-0d20696c-d452-497e-9d1d-4e865894d211 tempest-ImagesTestJSON-864704258 tempest-ImagesTestJSON-864704258-project-member] [instance: d23dea5a-ba35-4483-9fbe-df3ff2a32a24] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 977.647041] nova-conductor[52625]: ERROR nova.conductor.manager [None req-a8ada967-22b7-44cb-8ddc-6566764a6f15 tempest-ServersNegativeTestJSON-1930054237 tempest-ServersNegativeTestJSON-1930054237-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 977.647041] nova-conductor[52625]: Traceback (most recent call last): [ 977.647041] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 977.647041] nova-conductor[52625]: return func(*args, **kwargs) [ 977.647041] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 977.647041] nova-conductor[52625]: selections = self._select_destinations( [ 977.647041] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 977.647041] nova-conductor[52625]: selections = self._schedule( [ 977.647041] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 977.647041] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 977.647041] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 977.647041] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 977.647041] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. 
There are not enough hosts available. [ 977.647041] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 977.647041] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 977.647041] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 977.647041] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 977.647041] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 977.647041] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 977.647041] nova-conductor[52625]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 977.647041] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 977.647041] nova-conductor[52625]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 977.647041] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 977.647041] nova-conductor[52625]: ERROR nova.conductor.manager result = self.transport._send( [ 977.647041] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 977.647041] nova-conductor[52625]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 977.647041] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 977.647041] nova-conductor[52625]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 977.647041] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 977.647041] nova-conductor[52625]: ERROR nova.conductor.manager raise result [ 977.647041] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 977.647041] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 977.647041] nova-conductor[52625]: ERROR nova.conductor.manager [ 977.647041] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 977.647041] nova-conductor[52625]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 977.647041] nova-conductor[52625]: ERROR nova.conductor.manager [ 977.647041] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 977.647041] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._select_destinations( [ 977.647041] nova-conductor[52625]: ERROR nova.conductor.manager [ 977.647041] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 977.647041] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._schedule( [ 977.647041] nova-conductor[52625]: ERROR nova.conductor.manager [ 977.647041] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 977.647041] nova-conductor[52625]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 977.647041] nova-conductor[52625]: ERROR nova.conductor.manager [ 977.647041] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 977.647041] nova-conductor[52625]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 977.647041] nova-conductor[52625]: ERROR nova.conductor.manager [ 977.647041] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 977.647041] nova-conductor[52625]: ERROR nova.conductor.manager [ 977.647041] nova-conductor[52625]: ERROR nova.conductor.manager [ 977.653408] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-a8ada967-22b7-44cb-8ddc-6566764a6f15 tempest-ServersNegativeTestJSON-1930054237 tempest-ServersNegativeTestJSON-1930054237-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 977.653575] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-a8ada967-22b7-44cb-8ddc-6566764a6f15 tempest-ServersNegativeTestJSON-1930054237 tempest-ServersNegativeTestJSON-1930054237-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 977.653757] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-a8ada967-22b7-44cb-8ddc-6566764a6f15 tempest-ServersNegativeTestJSON-1930054237 tempest-ServersNegativeTestJSON-1930054237-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 977.695931] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-a8ada967-22b7-44cb-8ddc-6566764a6f15 tempest-ServersNegativeTestJSON-1930054237 tempest-ServersNegativeTestJSON-1930054237-project-member] [instance: 2f091453-4ca1-400e-96af-afd0cc7a126a] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 977.696621] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-a8ada967-22b7-44cb-8ddc-6566764a6f15 tempest-ServersNegativeTestJSON-1930054237 tempest-ServersNegativeTestJSON-1930054237-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 977.696859] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-a8ada967-22b7-44cb-8ddc-6566764a6f15 tempest-ServersNegativeTestJSON-1930054237 tempest-ServersNegativeTestJSON-1930054237-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 977.697042] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-a8ada967-22b7-44cb-8ddc-6566764a6f15 tempest-ServersNegativeTestJSON-1930054237 tempest-ServersNegativeTestJSON-1930054237-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 977.704357] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-a8ada967-22b7-44cb-8ddc-6566764a6f15 tempest-ServersNegativeTestJSON-1930054237 tempest-ServersNegativeTestJSON-1930054237-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 977.704357] nova-conductor[52625]: Traceback (most recent call last): [ 977.704357] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 977.704357] nova-conductor[52625]: return func(*args, **kwargs) [ 977.704357] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 977.704357] nova-conductor[52625]: selections = self._select_destinations( [ 977.704357] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 977.704357] nova-conductor[52625]: selections = self._schedule( [ 977.704357] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 977.704357] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 977.704357] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 977.704357] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 977.704357] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 977.704357] nova-conductor[52625]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 977.705037] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-a8ada967-22b7-44cb-8ddc-6566764a6f15 tempest-ServersNegativeTestJSON-1930054237 tempest-ServersNegativeTestJSON-1930054237-project-member] [instance: 2f091453-4ca1-400e-96af-afd0cc7a126a] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 981.233063] nova-conductor[52626]: ERROR nova.conductor.manager [None req-afa431db-df5c-4e56-89d1-a99dbacdcfc9 tempest-ServerRescueNegativeTestJSON-1605692616 tempest-ServerRescueNegativeTestJSON-1605692616-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 981.233063] nova-conductor[52626]: Traceback (most recent call last): [ 981.233063] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 981.233063] nova-conductor[52626]: return func(*args, **kwargs) [ 981.233063] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 981.233063] nova-conductor[52626]: selections = self._select_destinations( [ 981.233063] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 981.233063] nova-conductor[52626]: selections = self._schedule( [ 981.233063] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 981.233063] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 981.233063] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 981.233063] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 981.233063] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 981.233063] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 981.233063] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 981.233063] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 981.233063] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 981.233063] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 981.233063] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 981.233063] nova-conductor[52626]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 981.233063] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 981.233063] nova-conductor[52626]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 981.233063] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 981.233063] nova-conductor[52626]: ERROR nova.conductor.manager result = self.transport._send( [ 981.233063] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 981.233063] nova-conductor[52626]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 981.233063] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 981.233063] nova-conductor[52626]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 981.233063] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 981.233063] nova-conductor[52626]: ERROR nova.conductor.manager raise result [ 981.233063] nova-conductor[52626]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 981.233063] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 981.233063] nova-conductor[52626]: ERROR nova.conductor.manager [ 981.233063] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 981.233063] nova-conductor[52626]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 981.233063] nova-conductor[52626]: ERROR nova.conductor.manager [ 981.233063] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 981.233063] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._select_destinations( [ 981.233063] nova-conductor[52626]: ERROR nova.conductor.manager [ 981.233063] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 981.233063] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._schedule( [ 981.233063] nova-conductor[52626]: ERROR nova.conductor.manager [ 981.233063] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 981.233063] nova-conductor[52626]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 981.233063] nova-conductor[52626]: ERROR nova.conductor.manager [ 981.233063] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 981.233063] nova-conductor[52626]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 981.233063] nova-conductor[52626]: ERROR nova.conductor.manager [ 981.233063] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 981.233063] nova-conductor[52626]: ERROR nova.conductor.manager [ 981.233063] nova-conductor[52626]: ERROR nova.conductor.manager [ 981.240763] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-afa431db-df5c-4e56-89d1-a99dbacdcfc9 tempest-ServerRescueNegativeTestJSON-1605692616 tempest-ServerRescueNegativeTestJSON-1605692616-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 981.241028] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-afa431db-df5c-4e56-89d1-a99dbacdcfc9 tempest-ServerRescueNegativeTestJSON-1605692616 tempest-ServerRescueNegativeTestJSON-1605692616-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 981.241215] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-afa431db-df5c-4e56-89d1-a99dbacdcfc9 tempest-ServerRescueNegativeTestJSON-1605692616 tempest-ServerRescueNegativeTestJSON-1605692616-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 981.288990] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-afa431db-df5c-4e56-89d1-a99dbacdcfc9 tempest-ServerRescueNegativeTestJSON-1605692616 tempest-ServerRescueNegativeTestJSON-1605692616-project-member] [instance: bb1cca0e-be72-4041-b20c-3804665e6c02] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 981.290337] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-afa431db-df5c-4e56-89d1-a99dbacdcfc9 tempest-ServerRescueNegativeTestJSON-1605692616 tempest-ServerRescueNegativeTestJSON-1605692616-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 981.290675] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-afa431db-df5c-4e56-89d1-a99dbacdcfc9 tempest-ServerRescueNegativeTestJSON-1605692616 tempest-ServerRescueNegativeTestJSON-1605692616-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 981.290872] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-afa431db-df5c-4e56-89d1-a99dbacdcfc9 tempest-ServerRescueNegativeTestJSON-1605692616 
tempest-ServerRescueNegativeTestJSON-1605692616-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 981.297283] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-afa431db-df5c-4e56-89d1-a99dbacdcfc9 tempest-ServerRescueNegativeTestJSON-1605692616 tempest-ServerRescueNegativeTestJSON-1605692616-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 981.297283] nova-conductor[52626]: Traceback (most recent call last): [ 981.297283] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 981.297283] nova-conductor[52626]: return func(*args, **kwargs) [ 981.297283] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 981.297283] nova-conductor[52626]: selections = self._select_destinations( [ 981.297283] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 981.297283] nova-conductor[52626]: selections = self._schedule( [ 981.297283] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 981.297283] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 981.297283] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 981.297283] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 981.297283] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 981.297283] nova-conductor[52626]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 981.298424] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-afa431db-df5c-4e56-89d1-a99dbacdcfc9 tempest-ServerRescueNegativeTestJSON-1605692616 tempest-ServerRescueNegativeTestJSON-1605692616-project-member] [instance: bb1cca0e-be72-4041-b20c-3804665e6c02] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 981.710083] nova-conductor[52625]: ERROR nova.conductor.manager [None req-cf65e873-26f7-4f70-9740-822db4c6981a tempest-ServerRescueNegativeTestJSON-1605692616 tempest-ServerRescueNegativeTestJSON-1605692616-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 981.710083] nova-conductor[52625]: Traceback (most recent call last): [ 981.710083] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 981.710083] nova-conductor[52625]: return func(*args, **kwargs) [ 981.710083] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 981.710083] nova-conductor[52625]: selections = self._select_destinations( [ 981.710083] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 981.710083] nova-conductor[52625]: selections = self._schedule( [ 981.710083] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 981.710083] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 981.710083] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 981.710083] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 981.710083] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 981.710083] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 981.710083] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 981.710083] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 981.710083] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 981.710083] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 981.710083] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 981.710083] nova-conductor[52625]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 981.710083] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 981.710083] nova-conductor[52625]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 981.710083] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 981.710083] nova-conductor[52625]: ERROR nova.conductor.manager result = self.transport._send( [ 981.710083] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 981.710083] nova-conductor[52625]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 981.710083] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 981.710083] nova-conductor[52625]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 981.710083] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 981.710083] nova-conductor[52625]: ERROR nova.conductor.manager raise result [ 981.710083] nova-conductor[52625]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 981.710083] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 981.710083] nova-conductor[52625]: ERROR nova.conductor.manager [ 981.710083] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 981.710083] nova-conductor[52625]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 981.710083] nova-conductor[52625]: ERROR nova.conductor.manager [ 981.710083] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 981.710083] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._select_destinations( [ 981.710083] nova-conductor[52625]: ERROR nova.conductor.manager [ 981.710083] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 981.710083] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._schedule( [ 981.710083] nova-conductor[52625]: ERROR nova.conductor.manager [ 981.710083] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 981.710083] nova-conductor[52625]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 981.710083] nova-conductor[52625]: ERROR nova.conductor.manager [ 981.710083] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 981.710083] nova-conductor[52625]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 981.710083] nova-conductor[52625]: ERROR nova.conductor.manager [ 981.710083] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 981.710083] nova-conductor[52625]: ERROR nova.conductor.manager [ 981.710083] nova-conductor[52625]: ERROR nova.conductor.manager [ 981.718641] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-cf65e873-26f7-4f70-9740-822db4c6981a tempest-ServerRescueNegativeTestJSON-1605692616 tempest-ServerRescueNegativeTestJSON-1605692616-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 981.718880] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-cf65e873-26f7-4f70-9740-822db4c6981a tempest-ServerRescueNegativeTestJSON-1605692616 tempest-ServerRescueNegativeTestJSON-1605692616-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 981.719079] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-cf65e873-26f7-4f70-9740-822db4c6981a tempest-ServerRescueNegativeTestJSON-1605692616 tempest-ServerRescueNegativeTestJSON-1605692616-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 981.765846] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-cf65e873-26f7-4f70-9740-822db4c6981a tempest-ServerRescueNegativeTestJSON-1605692616 tempest-ServerRescueNegativeTestJSON-1605692616-project-member] [instance: 690b0c74-8002-42d0-b658-f029233344ba] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 981.766939] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-cf65e873-26f7-4f70-9740-822db4c6981a tempest-ServerRescueNegativeTestJSON-1605692616 tempest-ServerRescueNegativeTestJSON-1605692616-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 981.767325] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-cf65e873-26f7-4f70-9740-822db4c6981a tempest-ServerRescueNegativeTestJSON-1605692616 tempest-ServerRescueNegativeTestJSON-1605692616-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 981.767627] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-cf65e873-26f7-4f70-9740-822db4c6981a tempest-ServerRescueNegativeTestJSON-1605692616 
tempest-ServerRescueNegativeTestJSON-1605692616-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 981.772613] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-cf65e873-26f7-4f70-9740-822db4c6981a tempest-ServerRescueNegativeTestJSON-1605692616 tempest-ServerRescueNegativeTestJSON-1605692616-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 981.772613] nova-conductor[52625]: Traceback (most recent call last): [ 981.772613] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 981.772613] nova-conductor[52625]: return func(*args, **kwargs) [ 981.772613] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 981.772613] nova-conductor[52625]: selections = self._select_destinations( [ 981.772613] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 981.772613] nova-conductor[52625]: selections = self._schedule( [ 981.772613] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 981.772613] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 981.772613] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 981.772613] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 981.772613] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 981.772613] nova-conductor[52625]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 981.773594] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-cf65e873-26f7-4f70-9740-822db4c6981a tempest-ServerRescueNegativeTestJSON-1605692616 tempest-ServerRescueNegativeTestJSON-1605692616-project-member] [instance: 690b0c74-8002-42d0-b658-f029233344ba] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 984.598031] nova-conductor[52626]: ERROR nova.conductor.manager [None req-867d13a5-6c66-4783-85e5-5bd7e15dbf5f tempest-ServersV294TestFqdnHostnames-1455313213 tempest-ServersV294TestFqdnHostnames-1455313213-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 984.598031] nova-conductor[52626]: Traceback (most recent call last): [ 984.598031] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 984.598031] nova-conductor[52626]: return func(*args, **kwargs) [ 984.598031] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 984.598031] nova-conductor[52626]: selections = self._select_destinations( [ 984.598031] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 984.598031] nova-conductor[52626]: selections = self._schedule( [ 984.598031] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 984.598031] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 984.598031] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 984.598031] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 984.598031] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 984.598031] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 984.598031] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 984.598031] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 984.598031] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 984.598031] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 984.598031] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 984.598031] nova-conductor[52626]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 984.598031] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 984.598031] nova-conductor[52626]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 984.598031] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 984.598031] nova-conductor[52626]: ERROR nova.conductor.manager result = self.transport._send( [ 984.598031] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 984.598031] nova-conductor[52626]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 984.598031] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 984.598031] nova-conductor[52626]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 984.598031] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 984.598031] nova-conductor[52626]: ERROR nova.conductor.manager raise result [ 984.598031] nova-conductor[52626]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 984.598031] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 984.598031] nova-conductor[52626]: ERROR nova.conductor.manager [ 984.598031] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 984.598031] nova-conductor[52626]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 984.598031] nova-conductor[52626]: ERROR nova.conductor.manager [ 984.598031] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 984.598031] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._select_destinations( [ 984.598031] nova-conductor[52626]: ERROR nova.conductor.manager [ 984.598031] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 984.598031] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._schedule( [ 984.598031] nova-conductor[52626]: ERROR nova.conductor.manager [ 984.598031] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 984.598031] nova-conductor[52626]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 984.598031] nova-conductor[52626]: ERROR nova.conductor.manager [ 984.598031] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 984.598031] nova-conductor[52626]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 984.598031] nova-conductor[52626]: ERROR nova.conductor.manager [ 984.598031] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 984.598031] nova-conductor[52626]: ERROR nova.conductor.manager [ 984.598031] nova-conductor[52626]: ERROR nova.conductor.manager [ 984.611076] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-867d13a5-6c66-4783-85e5-5bd7e15dbf5f tempest-ServersV294TestFqdnHostnames-1455313213 tempest-ServersV294TestFqdnHostnames-1455313213-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 984.611402] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-867d13a5-6c66-4783-85e5-5bd7e15dbf5f tempest-ServersV294TestFqdnHostnames-1455313213 tempest-ServersV294TestFqdnHostnames-1455313213-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 984.611516] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-867d13a5-6c66-4783-85e5-5bd7e15dbf5f tempest-ServersV294TestFqdnHostnames-1455313213 tempest-ServersV294TestFqdnHostnames-1455313213-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 984.656154] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-867d13a5-6c66-4783-85e5-5bd7e15dbf5f tempest-ServersV294TestFqdnHostnames-1455313213 tempest-ServersV294TestFqdnHostnames-1455313213-project-member] [instance: cb1b41c9-5990-4328-b68c-a2e72a9ded13] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 984.656929] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-867d13a5-6c66-4783-85e5-5bd7e15dbf5f tempest-ServersV294TestFqdnHostnames-1455313213 tempest-ServersV294TestFqdnHostnames-1455313213-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 984.657172] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-867d13a5-6c66-4783-85e5-5bd7e15dbf5f tempest-ServersV294TestFqdnHostnames-1455313213 tempest-ServersV294TestFqdnHostnames-1455313213-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 984.657383] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-867d13a5-6c66-4783-85e5-5bd7e15dbf5f tempest-ServersV294TestFqdnHostnames-1455313213 
tempest-ServersV294TestFqdnHostnames-1455313213-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 984.663529] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-867d13a5-6c66-4783-85e5-5bd7e15dbf5f tempest-ServersV294TestFqdnHostnames-1455313213 tempest-ServersV294TestFqdnHostnames-1455313213-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 984.663529] nova-conductor[52626]: Traceback (most recent call last): [ 984.663529] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 984.663529] nova-conductor[52626]: return func(*args, **kwargs) [ 984.663529] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 984.663529] nova-conductor[52626]: selections = self._select_destinations( [ 984.663529] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 984.663529] nova-conductor[52626]: selections = self._schedule( [ 984.663529] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 984.663529] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 984.663529] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 984.663529] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 984.663529] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 984.663529] nova-conductor[52626]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 984.664221] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-867d13a5-6c66-4783-85e5-5bd7e15dbf5f tempest-ServersV294TestFqdnHostnames-1455313213 tempest-ServersV294TestFqdnHostnames-1455313213-project-member] [instance: cb1b41c9-5990-4328-b68c-a2e72a9ded13] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 996.071997] nova-conductor[52626]: ERROR nova.scheduler.utils [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance da3eaeea-ce26-40eb-af8b-8857f927e431 was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 996.072630] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Rescheduling: True {{(pid=52626) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 996.072866] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance da3eaeea-ce26-40eb-af8b-8857f927e431.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance da3eaeea-ce26-40eb-af8b-8857f927e431. 
[ 996.073093] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance da3eaeea-ce26-40eb-af8b-8857f927e431. [ 996.096927] nova-conductor[52626]: DEBUG nova.network.neutron [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] deallocate_for_instance() {{(pid=52626) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 996.221123] nova-conductor[52626]: DEBUG nova.network.neutron [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Instance cache missing network info. {{(pid=52626) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 996.224500] nova-conductor[52626]: DEBUG nova.network.neutron [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Updating instance_info_cache with network_info: [] {{(pid=52626) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1043.035151] nova-conductor[52626]: ERROR nova.scheduler.utils [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', 
"oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance 81af879b-3bc3-4aff-a99d-98d3aba73512 was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 1043.035752] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Rescheduling: True {{(pid=52626) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 1043.036012] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 81af879b-3bc3-4aff-a99d-98d3aba73512.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 81af879b-3bc3-4aff-a99d-98d3aba73512. [ 1043.036235] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 81af879b-3bc3-4aff-a99d-98d3aba73512. [ 1043.064820] nova-conductor[52626]: DEBUG nova.network.neutron [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] deallocate_for_instance() {{(pid=52626) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1043.082402] nova-conductor[52626]: DEBUG nova.network.neutron [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Instance cache missing network info. 
{{(pid=52626) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1043.085588] nova-conductor[52626]: DEBUG nova.network.neutron [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Updating instance_info_cache with network_info: [] {{(pid=52626) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1078.134122] nova-conductor[52625]: ERROR nova.conductor.manager [None req-12c6bf62-8f0b-430a-86ff-333280056406 tempest-ServerShowV247Test-435529770 tempest-ServerShowV247Test-435529770-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1078.134122] nova-conductor[52625]: Traceback (most recent call last): [ 1078.134122] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1078.134122] nova-conductor[52625]: return func(*args, **kwargs) [ 1078.134122] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1078.134122] nova-conductor[52625]: selections = self._select_destinations( [ 1078.134122] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1078.134122] nova-conductor[52625]: selections = self._schedule( [ 1078.134122] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1078.134122] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 1078.134122] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1078.134122] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 1078.134122] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 1078.134122] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1078.134122] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 1078.134122] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 1078.134122] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 1078.134122] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 1078.134122] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 1078.134122] nova-conductor[52625]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 1078.134122] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 1078.134122] nova-conductor[52625]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 1078.134122] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1078.134122] nova-conductor[52625]: ERROR nova.conductor.manager result = self.transport._send( [ 1078.134122] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1078.134122] nova-conductor[52625]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 1078.134122] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1078.134122] nova-conductor[52625]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1078.134122] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1078.134122] nova-conductor[52625]: ERROR nova.conductor.manager raise result [ 1078.134122] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 1078.134122] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1078.134122] nova-conductor[52625]: ERROR nova.conductor.manager [ 1078.134122] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1078.134122] nova-conductor[52625]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 1078.134122] nova-conductor[52625]: ERROR nova.conductor.manager [ 1078.134122] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1078.134122] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._select_destinations( [ 1078.134122] nova-conductor[52625]: ERROR nova.conductor.manager [ 1078.134122] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1078.134122] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._schedule( [ 1078.134122] nova-conductor[52625]: ERROR nova.conductor.manager [ 1078.134122] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1078.134122] nova-conductor[52625]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 1078.134122] nova-conductor[52625]: ERROR nova.conductor.manager [ 1078.134122] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1078.134122] nova-conductor[52625]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 1078.134122] nova-conductor[52625]: ERROR nova.conductor.manager [ 1078.134122] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 1078.134122] nova-conductor[52625]: ERROR nova.conductor.manager [ 1078.134122] nova-conductor[52625]: ERROR nova.conductor.manager [ 1078.141391] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-12c6bf62-8f0b-430a-86ff-333280056406 tempest-ServerShowV247Test-435529770 tempest-ServerShowV247Test-435529770-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1078.141492] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-12c6bf62-8f0b-430a-86ff-333280056406 tempest-ServerShowV247Test-435529770 tempest-ServerShowV247Test-435529770-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1078.141754] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-12c6bf62-8f0b-430a-86ff-333280056406 tempest-ServerShowV247Test-435529770 tempest-ServerShowV247Test-435529770-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1078.179521] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-12c6bf62-8f0b-430a-86ff-333280056406 tempest-ServerShowV247Test-435529770 tempest-ServerShowV247Test-435529770-project-member] [instance: 1c3b213e-2f53-48a7-94e4-ff2a48220994] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 1078.180252] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-12c6bf62-8f0b-430a-86ff-333280056406 tempest-ServerShowV247Test-435529770 tempest-ServerShowV247Test-435529770-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1078.180470] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-12c6bf62-8f0b-430a-86ff-333280056406 tempest-ServerShowV247Test-435529770 tempest-ServerShowV247Test-435529770-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1078.180642] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-12c6bf62-8f0b-430a-86ff-333280056406 tempest-ServerShowV247Test-435529770 tempest-ServerShowV247Test-435529770-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1078.183432] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-12c6bf62-8f0b-430a-86ff-333280056406 tempest-ServerShowV247Test-435529770 tempest-ServerShowV247Test-435529770-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 1078.183432] nova-conductor[52625]: Traceback (most recent call last): [ 1078.183432] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1078.183432] nova-conductor[52625]: return func(*args, **kwargs) [ 1078.183432] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1078.183432] nova-conductor[52625]: selections = self._select_destinations( [ 1078.183432] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1078.183432] nova-conductor[52625]: selections = self._schedule( [ 1078.183432] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1078.183432] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 1078.183432] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1078.183432] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 1078.183432] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1078.183432] nova-conductor[52625]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1078.183952] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-12c6bf62-8f0b-430a-86ff-333280056406 tempest-ServerShowV247Test-435529770 tempest-ServerShowV247Test-435529770-project-member] [instance: 1c3b213e-2f53-48a7-94e4-ff2a48220994] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1078.289029] nova-conductor[52626]: ERROR nova.conductor.manager [None req-ba2aac4e-cfde-4068-914d-0fc195f354fc tempest-ServerShowV247Test-435529770 tempest-ServerShowV247Test-435529770-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 1078.289029] nova-conductor[52626]: Traceback (most recent call last): [ 1078.289029] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1078.289029] nova-conductor[52626]: return func(*args, **kwargs) [ 1078.289029] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1078.289029] nova-conductor[52626]: selections = self._select_destinations( [ 1078.289029] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1078.289029] nova-conductor[52626]: selections = self._schedule( [ 1078.289029] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1078.289029] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 1078.289029] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1078.289029] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 1078.289029] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1078.289029] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1078.289029] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 1078.289029] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 1078.289029] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 1078.289029] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 1078.289029] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 1078.289029] nova-conductor[52626]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 1078.289029] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 1078.289029] nova-conductor[52626]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 1078.289029] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1078.289029] nova-conductor[52626]: ERROR nova.conductor.manager result = self.transport._send( [ 1078.289029] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1078.289029] nova-conductor[52626]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 1078.289029] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1078.289029] nova-conductor[52626]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1078.289029] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1078.289029] nova-conductor[52626]: ERROR nova.conductor.manager raise result [ 1078.289029] nova-conductor[52626]: ERROR 
nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1078.289029] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1078.289029] nova-conductor[52626]: ERROR nova.conductor.manager [ 1078.289029] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1078.289029] nova-conductor[52626]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 1078.289029] nova-conductor[52626]: ERROR nova.conductor.manager [ 1078.289029] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1078.289029] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._select_destinations( [ 1078.289029] nova-conductor[52626]: ERROR nova.conductor.manager [ 1078.289029] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1078.289029] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._schedule( [ 1078.289029] nova-conductor[52626]: ERROR nova.conductor.manager [ 1078.289029] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1078.289029] nova-conductor[52626]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 1078.289029] nova-conductor[52626]: ERROR nova.conductor.manager [ 1078.289029] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1078.289029] nova-conductor[52626]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 1078.289029] nova-conductor[52626]: ERROR nova.conductor.manager [ 1078.289029] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 1078.289029] nova-conductor[52626]: ERROR nova.conductor.manager [ 1078.289029] nova-conductor[52626]: ERROR nova.conductor.manager [ 1078.295016] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-ba2aac4e-cfde-4068-914d-0fc195f354fc tempest-ServerShowV247Test-435529770 tempest-ServerShowV247Test-435529770-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1078.295277] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-ba2aac4e-cfde-4068-914d-0fc195f354fc tempest-ServerShowV247Test-435529770 tempest-ServerShowV247Test-435529770-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1078.295473] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-ba2aac4e-cfde-4068-914d-0fc195f354fc tempest-ServerShowV247Test-435529770 tempest-ServerShowV247Test-435529770-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1078.332816] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-ba2aac4e-cfde-4068-914d-0fc195f354fc tempest-ServerShowV247Test-435529770 tempest-ServerShowV247Test-435529770-project-member] [instance: 1eb10b69-cac1-4046-90d6-c512d365cb0f] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 1078.333466] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-ba2aac4e-cfde-4068-914d-0fc195f354fc tempest-ServerShowV247Test-435529770 tempest-ServerShowV247Test-435529770-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1078.333676] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-ba2aac4e-cfde-4068-914d-0fc195f354fc tempest-ServerShowV247Test-435529770 tempest-ServerShowV247Test-435529770-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1078.333847] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-ba2aac4e-cfde-4068-914d-0fc195f354fc tempest-ServerShowV247Test-435529770 tempest-ServerShowV247Test-435529770-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1078.338446] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-ba2aac4e-cfde-4068-914d-0fc195f354fc tempest-ServerShowV247Test-435529770 tempest-ServerShowV247Test-435529770-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 1078.338446] nova-conductor[52626]: Traceback (most recent call last): [ 1078.338446] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1078.338446] nova-conductor[52626]: return func(*args, **kwargs) [ 1078.338446] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1078.338446] nova-conductor[52626]: selections = self._select_destinations( [ 1078.338446] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1078.338446] nova-conductor[52626]: selections = self._schedule( [ 1078.338446] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1078.338446] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 1078.338446] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1078.338446] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 1078.338446] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1078.338446] nova-conductor[52626]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1078.338995] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-ba2aac4e-cfde-4068-914d-0fc195f354fc tempest-ServerShowV247Test-435529770 tempest-ServerShowV247Test-435529770-project-member] [instance: 1eb10b69-cac1-4046-90d6-c512d365cb0f] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1083.490702] nova-conductor[52625]: ERROR nova.conductor.manager [None req-78615113-e149-4151-9598-1fc9eb0c9a23 tempest-AttachVolumeShelveTestJSON-1817785103 tempest-AttachVolumeShelveTestJSON-1817785103-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 1083.490702] nova-conductor[52625]: Traceback (most recent call last): [ 1083.490702] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1083.490702] nova-conductor[52625]: return func(*args, **kwargs) [ 1083.490702] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1083.490702] nova-conductor[52625]: selections = self._select_destinations( [ 1083.490702] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1083.490702] nova-conductor[52625]: selections = self._schedule( [ 1083.490702] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1083.490702] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 1083.490702] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1083.490702] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 1083.490702] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1083.490702] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1083.490702] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 1083.490702] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 1083.490702] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 1083.490702] nova-conductor[52625]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 1083.490702] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 1083.490702] nova-conductor[52625]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 1083.490702] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 1083.490702] nova-conductor[52625]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 1083.490702] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1083.490702] nova-conductor[52625]: ERROR nova.conductor.manager result = self.transport._send( [ 1083.490702] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1083.490702] nova-conductor[52625]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 1083.490702] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1083.490702] nova-conductor[52625]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1083.490702] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1083.490702] nova-conductor[52625]: ERROR nova.conductor.manager raise result [ 1083.490702] nova-conductor[52625]: ERROR 
nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1083.490702] nova-conductor[52625]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1083.490702] nova-conductor[52625]: ERROR nova.conductor.manager [ 1083.490702] nova-conductor[52625]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1083.490702] nova-conductor[52625]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 1083.490702] nova-conductor[52625]: ERROR nova.conductor.manager [ 1083.490702] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1083.490702] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._select_destinations( [ 1083.490702] nova-conductor[52625]: ERROR nova.conductor.manager [ 1083.490702] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1083.490702] nova-conductor[52625]: ERROR nova.conductor.manager selections = self._schedule( [ 1083.490702] nova-conductor[52625]: ERROR nova.conductor.manager [ 1083.490702] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1083.490702] nova-conductor[52625]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 1083.490702] nova-conductor[52625]: ERROR nova.conductor.manager [ 1083.490702] nova-conductor[52625]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1083.490702] nova-conductor[52625]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 1083.490702] nova-conductor[52625]: ERROR nova.conductor.manager [ 1083.490702] nova-conductor[52625]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 1083.490702] nova-conductor[52625]: ERROR nova.conductor.manager [ 1083.490702] nova-conductor[52625]: ERROR nova.conductor.manager [ 1083.498576] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-78615113-e149-4151-9598-1fc9eb0c9a23 tempest-AttachVolumeShelveTestJSON-1817785103 tempest-AttachVolumeShelveTestJSON-1817785103-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1083.498788] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-78615113-e149-4151-9598-1fc9eb0c9a23 tempest-AttachVolumeShelveTestJSON-1817785103 tempest-AttachVolumeShelveTestJSON-1817785103-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1083.498961] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-78615113-e149-4151-9598-1fc9eb0c9a23 tempest-AttachVolumeShelveTestJSON-1817785103 tempest-AttachVolumeShelveTestJSON-1817785103-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1083.539650] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-78615113-e149-4151-9598-1fc9eb0c9a23 tempest-AttachVolumeShelveTestJSON-1817785103 tempest-AttachVolumeShelveTestJSON-1817785103-project-member] [instance: 84161092-10a5-430f-9124-52ec04005771] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52625) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 1083.540351] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-78615113-e149-4151-9598-1fc9eb0c9a23 tempest-AttachVolumeShelveTestJSON-1817785103 tempest-AttachVolumeShelveTestJSON-1817785103-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1083.540568] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-78615113-e149-4151-9598-1fc9eb0c9a23 tempest-AttachVolumeShelveTestJSON-1817785103 tempest-AttachVolumeShelveTestJSON-1817785103-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1083.540770] nova-conductor[52625]: DEBUG oslo_concurrency.lockutils [None req-78615113-e149-4151-9598-1fc9eb0c9a23 tempest-AttachVolumeShelveTestJSON-1817785103 
tempest-AttachVolumeShelveTestJSON-1817785103-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52625) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1083.544700] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-78615113-e149-4151-9598-1fc9eb0c9a23 tempest-AttachVolumeShelveTestJSON-1817785103 tempest-AttachVolumeShelveTestJSON-1817785103-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 1083.544700] nova-conductor[52625]: Traceback (most recent call last): [ 1083.544700] nova-conductor[52625]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1083.544700] nova-conductor[52625]: return func(*args, **kwargs) [ 1083.544700] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1083.544700] nova-conductor[52625]: selections = self._select_destinations( [ 1083.544700] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1083.544700] nova-conductor[52625]: selections = self._schedule( [ 1083.544700] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1083.544700] nova-conductor[52625]: self._ensure_sufficient_hosts( [ 1083.544700] nova-conductor[52625]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1083.544700] nova-conductor[52625]: raise exception.NoValidHost(reason=reason) [ 1083.544700] nova-conductor[52625]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1083.544700] nova-conductor[52625]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1083.545454] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-78615113-e149-4151-9598-1fc9eb0c9a23 tempest-AttachVolumeShelveTestJSON-1817785103 tempest-AttachVolumeShelveTestJSON-1817785103-project-member] [instance: 84161092-10a5-430f-9124-52ec04005771] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1086.248961] nova-conductor[52626]: ERROR nova.conductor.manager [None req-f9f00991-f2b5-4820-a4cb-571e6e0fcb5a tempest-AttachVolumeShelveTestJSON-1817785103 tempest-AttachVolumeShelveTestJSON-1817785103-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 1086.248961] nova-conductor[52626]: Traceback (most recent call last): [ 1086.248961] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1086.248961] nova-conductor[52626]: return func(*args, **kwargs) [ 1086.248961] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1086.248961] nova-conductor[52626]: selections = self._select_destinations( [ 1086.248961] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1086.248961] nova-conductor[52626]: selections = self._schedule( [ 1086.248961] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1086.248961] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 1086.248961] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1086.248961] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 1086.248961] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1086.248961] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1086.248961] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 1086.248961] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 1086.248961] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 1086.248961] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 1086.248961] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 1086.248961] nova-conductor[52626]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 1086.248961] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 1086.248961] nova-conductor[52626]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 1086.248961] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1086.248961] nova-conductor[52626]: ERROR nova.conductor.manager result = self.transport._send( [ 1086.248961] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1086.248961] nova-conductor[52626]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 1086.248961] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1086.248961] nova-conductor[52626]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1086.248961] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1086.248961] nova-conductor[52626]: ERROR nova.conductor.manager raise result [ 1086.248961] nova-conductor[52626]: ERROR 
nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1086.248961] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1086.248961] nova-conductor[52626]: ERROR nova.conductor.manager [ 1086.248961] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1086.248961] nova-conductor[52626]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 1086.248961] nova-conductor[52626]: ERROR nova.conductor.manager [ 1086.248961] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1086.248961] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._select_destinations( [ 1086.248961] nova-conductor[52626]: ERROR nova.conductor.manager [ 1086.248961] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1086.248961] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._schedule( [ 1086.248961] nova-conductor[52626]: ERROR nova.conductor.manager [ 1086.248961] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1086.248961] nova-conductor[52626]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 1086.248961] nova-conductor[52626]: ERROR nova.conductor.manager [ 1086.248961] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1086.248961] nova-conductor[52626]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 1086.248961] nova-conductor[52626]: ERROR nova.conductor.manager [ 1086.248961] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 1086.248961] nova-conductor[52626]: ERROR nova.conductor.manager [ 1086.248961] nova-conductor[52626]: ERROR nova.conductor.manager [ 1086.256602] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-f9f00991-f2b5-4820-a4cb-571e6e0fcb5a tempest-AttachVolumeShelveTestJSON-1817785103 tempest-AttachVolumeShelveTestJSON-1817785103-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1086.256862] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-f9f00991-f2b5-4820-a4cb-571e6e0fcb5a tempest-AttachVolumeShelveTestJSON-1817785103 tempest-AttachVolumeShelveTestJSON-1817785103-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1086.257015] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-f9f00991-f2b5-4820-a4cb-571e6e0fcb5a tempest-AttachVolumeShelveTestJSON-1817785103 tempest-AttachVolumeShelveTestJSON-1817785103-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1086.297383] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-f9f00991-f2b5-4820-a4cb-571e6e0fcb5a tempest-AttachVolumeShelveTestJSON-1817785103 tempest-AttachVolumeShelveTestJSON-1817785103-project-member] [instance: 197097d6-b1da-4ed9-8e84-de7fcd3ce082] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 1086.298115] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-f9f00991-f2b5-4820-a4cb-571e6e0fcb5a tempest-AttachVolumeShelveTestJSON-1817785103 tempest-AttachVolumeShelveTestJSON-1817785103-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1086.298361] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-f9f00991-f2b5-4820-a4cb-571e6e0fcb5a tempest-AttachVolumeShelveTestJSON-1817785103 tempest-AttachVolumeShelveTestJSON-1817785103-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1086.298535] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-f9f00991-f2b5-4820-a4cb-571e6e0fcb5a tempest-AttachVolumeShelveTestJSON-1817785103 
tempest-AttachVolumeShelveTestJSON-1817785103-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1086.301276] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-f9f00991-f2b5-4820-a4cb-571e6e0fcb5a tempest-AttachVolumeShelveTestJSON-1817785103 tempest-AttachVolumeShelveTestJSON-1817785103-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 1086.301276] nova-conductor[52626]: Traceback (most recent call last): [ 1086.301276] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1086.301276] nova-conductor[52626]: return func(*args, **kwargs) [ 1086.301276] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1086.301276] nova-conductor[52626]: selections = self._select_destinations( [ 1086.301276] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1086.301276] nova-conductor[52626]: selections = self._schedule( [ 1086.301276] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1086.301276] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 1086.301276] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1086.301276] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 1086.301276] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1086.301276] nova-conductor[52626]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1086.301782] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-f9f00991-f2b5-4820-a4cb-571e6e0fcb5a tempest-AttachVolumeShelveTestJSON-1817785103 tempest-AttachVolumeShelveTestJSON-1817785103-project-member] [instance: 197097d6-b1da-4ed9-8e84-de7fcd3ce082] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 1093.004936] nova-conductor[52626]: ERROR nova.scheduler.utils [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance 843d4db6-c1fb-4b74-ad3c-779e309a170e was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 1093.005810] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Rescheduling: True {{(pid=52626) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 1093.006136] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 843d4db6-c1fb-4b74-ad3c-779e309a170e.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. 
Exhausted all hosts available for retrying build failures for instance 843d4db6-c1fb-4b74-ad3c-779e309a170e. [ 1093.006396] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 843d4db6-c1fb-4b74-ad3c-779e309a170e. [ 1093.028024] nova-conductor[52626]: DEBUG nova.network.neutron [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] deallocate_for_instance() {{(pid=52626) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1093.046183] nova-conductor[52626]: DEBUG nova.network.neutron [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Instance cache missing network info. {{(pid=52626) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1093.049623] nova-conductor[52626]: DEBUG nova.network.neutron [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Updating instance_info_cache with network_info: [] {{(pid=52626) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1093.249897] nova-conductor[52625]: DEBUG nova.db.main.api [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Created instance_extra for 029d2099-2e55-4632-81b6-b59d6a20faab {{(pid=52625) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 1093.987745] nova-conductor[52625]: DEBUG nova.db.main.api [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Created instance_extra for 068814dd-328c-48d1-b514-34eb43b0f2b1 {{(pid=52625) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 1094.709625] nova-conductor[52625]: DEBUG nova.db.main.api [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Created instance_extra for 500d78f9-ee0c-4620-9936-1a9b4f4fc09a {{(pid=52625) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 1095.488300] nova-conductor[52625]: DEBUG nova.db.main.api [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Created instance_extra for 57a5dcae-6861-418a-a041-9cd5b7a43982 {{(pid=52625) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}} [ 1096.143214] nova-conductor[52626]: DEBUG nova.db.main.api [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Created instance_extra for 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a {{(pid=52626) instance_extra_update_by_uuid 
/opt/stack/nova/nova/db/main/api.py:2551}} [ 1100.061852] nova-conductor[52626]: ERROR nova.conductor.manager [None req-14f3239d-988f-4870-bd6c-9c79201d6a61 tempest-ServerDiagnosticsTest-1756927339 tempest-ServerDiagnosticsTest-1756927339-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1100.061852] nova-conductor[52626]: Traceback (most recent call last): [ 1100.061852] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1100.061852] nova-conductor[52626]: return func(*args, **kwargs) [ 1100.061852] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1100.061852] nova-conductor[52626]: selections = self._select_destinations( [ 1100.061852] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1100.061852] nova-conductor[52626]: selections = self._schedule( [ 1100.061852] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1100.061852] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 1100.061852] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1100.061852] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 1100.061852] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1100.061852] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1100.061852] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 1100.061852] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 1100.061852] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 1100.061852] nova-conductor[52626]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 1100.061852] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 1100.061852] nova-conductor[52626]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 1100.061852] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 1100.061852] nova-conductor[52626]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 1100.061852] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1100.061852] nova-conductor[52626]: ERROR nova.conductor.manager result = self.transport._send( [ 1100.061852] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1100.061852] nova-conductor[52626]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 1100.061852] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1100.061852] nova-conductor[52626]: ERROR 
nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1100.061852] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1100.061852] nova-conductor[52626]: ERROR nova.conductor.manager raise result [ 1100.061852] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1100.061852] nova-conductor[52626]: ERROR nova.conductor.manager Traceback (most recent call last): [ 1100.061852] nova-conductor[52626]: ERROR nova.conductor.manager [ 1100.061852] nova-conductor[52626]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1100.061852] nova-conductor[52626]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 1100.061852] nova-conductor[52626]: ERROR nova.conductor.manager [ 1100.061852] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1100.061852] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._select_destinations( [ 1100.061852] nova-conductor[52626]: ERROR nova.conductor.manager [ 1100.061852] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1100.061852] nova-conductor[52626]: ERROR nova.conductor.manager selections = self._schedule( [ 1100.061852] nova-conductor[52626]: ERROR nova.conductor.manager [ 1100.061852] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1100.061852] nova-conductor[52626]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 1100.061852] nova-conductor[52626]: ERROR nova.conductor.manager [ 1100.061852] nova-conductor[52626]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1100.061852] nova-conductor[52626]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 1100.061852] nova-conductor[52626]: ERROR nova.conductor.manager [ 1100.061852] nova-conductor[52626]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 1100.061852] nova-conductor[52626]: ERROR nova.conductor.manager [ 1100.061852] nova-conductor[52626]: ERROR nova.conductor.manager [ 1100.068643] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-14f3239d-988f-4870-bd6c-9c79201d6a61 tempest-ServerDiagnosticsTest-1756927339 tempest-ServerDiagnosticsTest-1756927339-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1100.068872] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-14f3239d-988f-4870-bd6c-9c79201d6a61 tempest-ServerDiagnosticsTest-1756927339 tempest-ServerDiagnosticsTest-1756927339-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1100.069082] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-14f3239d-988f-4870-bd6c-9c79201d6a61 tempest-ServerDiagnosticsTest-1756927339 tempest-ServerDiagnosticsTest-1756927339-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1100.105043] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-14f3239d-988f-4870-bd6c-9c79201d6a61 tempest-ServerDiagnosticsTest-1756927339 tempest-ServerDiagnosticsTest-1756927339-project-member] [instance: 78b30fe4-8203-44fa-9812-a833b9b15eac] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 1100.105679] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-14f3239d-988f-4870-bd6c-9c79201d6a61 tempest-ServerDiagnosticsTest-1756927339 tempest-ServerDiagnosticsTest-1756927339-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1100.105884] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-14f3239d-988f-4870-bd6c-9c79201d6a61 tempest-ServerDiagnosticsTest-1756927339 tempest-ServerDiagnosticsTest-1756927339-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1100.106070] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-14f3239d-988f-4870-bd6c-9c79201d6a61 tempest-ServerDiagnosticsTest-1756927339 tempest-ServerDiagnosticsTest-1756927339-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1100.108713] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-14f3239d-988f-4870-bd6c-9c79201d6a61 tempest-ServerDiagnosticsTest-1756927339 tempest-ServerDiagnosticsTest-1756927339-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 1100.108713] nova-conductor[52626]: Traceback (most recent call last): [ 1100.108713] nova-conductor[52626]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 1100.108713] nova-conductor[52626]: return func(*args, **kwargs) [ 1100.108713] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 1100.108713] nova-conductor[52626]: selections = self._select_destinations( [ 1100.108713] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 1100.108713] nova-conductor[52626]: selections = self._schedule( [ 1100.108713] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 1100.108713] nova-conductor[52626]: self._ensure_sufficient_hosts( [ 1100.108713] nova-conductor[52626]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 1100.108713] nova-conductor[52626]: raise exception.NoValidHost(reason=reason) [ 1100.108713] nova-conductor[52626]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 1100.108713] nova-conductor[52626]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 1100.109235] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-14f3239d-988f-4870-bd6c-9c79201d6a61 tempest-ServerDiagnosticsTest-1756927339 tempest-ServerDiagnosticsTest-1756927339-project-member] [instance: 78b30fe4-8203-44fa-9812-a833b9b15eac] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 1147.985138] nova-conductor[52625]: ERROR nova.scheduler.utils [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance 67cfe7ba-4590-451b-9e1a-340977b597a4 was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 1147.985714] nova-conductor[52625]: DEBUG nova.conductor.manager [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Rescheduling: True {{(pid=52625) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 1147.985944] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 67cfe7ba-4590-451b-9e1a-340977b597a4.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. 
Exhausted all hosts available for retrying build failures for instance 67cfe7ba-4590-451b-9e1a-340977b597a4. [ 1147.986189] nova-conductor[52625]: WARNING nova.scheduler.utils [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 67cfe7ba-4590-451b-9e1a-340977b597a4. [ 1148.004835] nova-conductor[52625]: DEBUG nova.network.neutron [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] deallocate_for_instance() {{(pid=52625) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1148.048804] nova-conductor[52625]: DEBUG nova.network.neutron [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Instance cache missing network info. {{(pid=52625) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1148.052230] nova-conductor[52625]: DEBUG nova.network.neutron [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Updating instance_info_cache with network_info: [] {{(pid=52625) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1153.458055] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Took 0.12 seconds to select destinations for 1 instance(s). 
{{(pid=52626) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 1153.471029] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1153.471029] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1153.471029] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1153.496734] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1153.497078] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1153.497321] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1153.497735] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1153.497987] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 
tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1153.498330] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1153.506624] nova-conductor[52626]: DEBUG nova.quota [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Getting quotas for project 358c9351af584a2b96e16a72d45da8da. Resources: {'cores', 'instances', 'ram'} {{(pid=52626) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 1153.508947] nova-conductor[52626]: DEBUG nova.quota [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Getting quotas for user df01dd84e1384c44a40bc475ee636bbc and project 358c9351af584a2b96e16a72d45da8da. Resources: {'cores', 'instances', 'ram'} {{(pid=52626) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 1153.514764] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52626) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 1153.515080] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1153.515350] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1153.515667] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1153.518438] nova-conductor[52626]: DEBUG nova.conductor.manager [None 
req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ab7fcb5a-745a-4c08-9c04-49b187178f83',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52626) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 1153.519769] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1153.519769] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1153.519769] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1153.531599] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1153.531882] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1153.532138] nova-conductor[52626]: DEBUG oslo_concurrency.lockutils [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52626) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1197.905675] nova-conductor[52626]: DEBUG nova.db.main.api [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Created instance_extra for e2c5328d-ba5a-4348-8a3f-2a9f745e8f08 {{(pid=52626) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}}
[ 1198.123185] nova-conductor[52625]: DEBUG nova.db.main.api [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Created instance_extra for a45c150e-942b-454a-ab59-aa6b191bfada {{(pid=52625) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}}
[ 1198.845054] nova-conductor[52625]: DEBUG nova.db.main.api [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Created instance_extra for 8f4635d8-5789-4402-8ca2-543b4d4dfc76 {{(pid=52625) instance_extra_update_by_uuid /opt/stack/nova/nova/db/main/api.py:2551}}
[ 1248.399395] nova-conductor[52626]: ERROR nova.scheduler.utils [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance d2ff993d-35d8-479c-bb3e-2c06080896d0 was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"]
[ 1248.399929] nova-conductor[52626]: DEBUG nova.conductor.manager [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Rescheduling: True {{(pid=52626) build_instances /opt/stack/nova/nova/conductor/manager.py:695}}
[ 1248.400170] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance d2ff993d-35d8-479c-bb3e-2c06080896d0.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance d2ff993d-35d8-479c-bb3e-2c06080896d0.
[ 1248.400505] nova-conductor[52626]: WARNING nova.scheduler.utils [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance d2ff993d-35d8-479c-bb3e-2c06080896d0.
[ 1248.423896] nova-conductor[52626]: DEBUG nova.network.neutron [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] deallocate_for_instance() {{(pid=52626) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1248.442020] nova-conductor[52626]: DEBUG nova.network.neutron [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Instance cache missing network info. {{(pid=52626) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1248.445225] nova-conductor[52626]: DEBUG nova.network.neutron [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Updating instance_info_cache with network_info: [] {{(pid=52626) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}