[ 474.969739] nova-conductor[51797]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 476.187888] nova-conductor[51797]: DEBUG oslo_db.sqlalchemy.engines [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=51797) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 476.213983] nova-conductor[51797]: DEBUG nova.context [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),ca1c0d21-d6a1-418c-abcf-92608d1b00f5(cell1) {{(pid=51797) load_cells /opt/stack/nova/nova/context.py:464}}
[ 476.215841] nova-conductor[51797]: DEBUG oslo_concurrency.lockutils [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=51797) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 476.216082] nova-conductor[51797]: DEBUG oslo_concurrency.lockutils [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=51797) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 476.216553] nova-conductor[51797]: DEBUG oslo_concurrency.lockutils [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=51797) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 476.216883] nova-conductor[51797]: DEBUG oslo_concurrency.lockutils [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=51797) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 476.217076] nova-conductor[51797]: DEBUG oslo_concurrency.lockutils [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=51797) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 476.218028] nova-conductor[51797]: DEBUG oslo_concurrency.lockutils [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=51797) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 476.223281] nova-conductor[51797]: DEBUG oslo_db.sqlalchemy.engines [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=51797) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 476.223638] nova-conductor[51797]: DEBUG oslo_db.sqlalchemy.engines [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=51797) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 476.283633] nova-conductor[51797]: DEBUG oslo_concurrency.lockutils [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] Acquiring lock "singleton_lock" {{(pid=51797) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 476.283803] nova-conductor[51797]: DEBUG oslo_concurrency.lockutils [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] Acquired lock "singleton_lock" {{(pid=51797) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 476.284043] nova-conductor[51797]: DEBUG oslo_concurrency.lockutils [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] Releasing lock "singleton_lock" {{(pid=51797) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 476.284462] nova-conductor[51797]: INFO oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] Starting 2 workers
[ 476.289092] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] Started child 52216 {{(pid=51797) _start_child /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:575}}
[ 476.292692] nova-conductor[52216]: INFO nova.service [-] Starting conductor node (version 0.1.0)
[ 476.292927] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] Started child 52217 {{(pid=51797) _start_child /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:575}}
[ 476.293581] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] Full set of CONF: {{(pid=51797) wait /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:649}}
[ 476.293791] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] ******************************************************************************** {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2589}}
[ 476.293938] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] Configuration options gathered from: {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2590}}
[ 476.294126] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] command line args: ['--config-file', '/etc/nova/nova.conf'] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2591}}
[ 476.294453] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] config files: ['/etc/nova/nova.conf'] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2592}}
[ 476.294594] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] ================================================================================ {{(pid=51797) log_opt_values
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2594}} [ 476.294999] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] allow_resize_to_same_host = True {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.295277] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] arq_binding_timeout = 300 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.295480] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] block_device_allocate_retries = 60 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.295665] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] block_device_allocate_retries_interval = 3 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.295867] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cert = self.pem {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.296132] nova-conductor[52217]: INFO nova.service [-] Starting conductor node (version 0.1.0) [ 476.296278] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] compute_driver = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.296278] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] compute_monitors = [] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.296523] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] config_dir = [] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.296770] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] config_drive_format = iso9660 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.296928] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] config_file = ['/etc/nova/nova.conf'] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.297128] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] config_source = [] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.297322] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] console_host = devstack {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.297521] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] control_exchange = nova {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.297726] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cpu_allocation_ratio = None {{(pid=51797) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.297903] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] daemon = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.298106] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] debug = True {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.298307] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] default_access_ip_network_name = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.298500] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] default_availability_zone = nova {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.298670] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] default_ephemeral_format = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.298955] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.299150] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] default_schedule_zone = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.299308] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] disk_allocation_ratio = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.299477] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] enable_new_services = True {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.299739] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] enabled_apis = ['osapi_compute'] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.299948] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] enabled_ssl_apis = [] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.300134] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] flat_injected = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.300305] 
nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] force_config_drive = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.300479] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] force_raw_images = True {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.300647] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] graceful_shutdown_timeout = 5 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.300821] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] heal_instance_info_cache_interval = 60 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.301644] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] host = devstack {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.301842] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] initial_cpu_allocation_ratio = 4.0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.302035] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] initial_disk_allocation_ratio = 1.0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.302196] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] initial_ram_allocation_ratio = 1.0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.302448] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.302649] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] instance_build_timeout = 0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.302836] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] instance_delete_interval = 300 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.303019] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] instance_format = [instance: %(uuid)s] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.303184] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] instance_name_template = instance-%08x {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.303347] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] instance_usage_audit = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.303530] nova-conductor[51797]: DEBUG 
oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] instance_usage_audit_period = month {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.303820] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.303878] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] instances_path = /opt/stack/data/nova/instances {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.304040] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] internal_service_availability_zone = internal {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.304222] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] key = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.304341] nova-conductor[52216]: DEBUG oslo_db.sqlalchemy.engines [None req-9af4c471-de86-4dbc-9dac-6cd35efd8711 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=52216) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 476.304406] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] live_migration_retry_count = 30 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.304578] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] log_config_append = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.304859] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.304922] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] log_dir = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.305099] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] log_file = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.305230] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] log_options = True {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.305399] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] log_rotate_interval = 1 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.305602] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] log_rotate_interval_type = days {{(pid=51797) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.306329] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] log_rotation_type = none {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.306329] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.306329] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.306329] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.306489] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.306522] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.306714] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] long_rpc_timeout = 1800 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.306819] nova-conductor[52217]: DEBUG oslo_db.sqlalchemy.engines [None req-e9e23f9a-fcd3-4d64-a297-f5cfe8f22d27 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=52217) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 476.306869] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] max_concurrent_builds = 10 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.307036] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] max_concurrent_live_migrations = 1 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.307224] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] max_concurrent_snapshots = 5 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.307378] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] max_local_block_devices = 3 {{(pid=51797) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.307557] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] max_logfile_count = 30 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.307718] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] max_logfile_size_mb = 200 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.307875] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] maximum_instance_delete_attempts = 5 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.308076] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] metadata_listen = 0.0.0.0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.308292] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] metadata_listen_port = 8775 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.308461] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] metadata_workers = 2 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.308609] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] migrate_max_retries = -1 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.308765] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] mkisofs_cmd = genisoimage {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.308987] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] my_block_storage_ip = 10.180.1.21 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.309125] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] my_ip = 10.180.1.21 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.309286] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] network_allocate_retries = 0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.309474] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.309633] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] osapi_compute_listen = 0.0.0.0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.309792] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] osapi_compute_listen_port = 8774 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.310130] 
nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] osapi_compute_unique_server_name_scope = {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.310130] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] osapi_compute_workers = 2 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.310283] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] password_length = 12 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.310430] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] periodic_enable = True {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.310577] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] periodic_fuzzy_delay = 60 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.310733] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] pointer_model = usbtablet {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.311155] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] preallocate_images = none {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.311155] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] publish_errors = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.311248] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] pybasedir = /opt/stack/nova {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.311346] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] ram_allocation_ratio = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.311494] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] rate_limit_burst = 0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.311653] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] rate_limit_except_level = CRITICAL {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.311806] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] rate_limit_interval = 0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.311955] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] reboot_timeout = 0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.312116] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] reclaim_instance_interval = 0 
{{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.312264] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] record = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.312413] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] reimage_timeout_per_gb = 20 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.312561] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] report_interval = 10 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.312715] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] rescue_timeout = 0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.312864] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] reserved_host_cpus = 0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.313027] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] reserved_host_disk_mb = 0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.313516] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] reserved_host_memory_mb = 512 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.313697] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] reserved_huge_pages = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.313858] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] resize_confirm_window = 0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.314023] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] resize_fs_using_block_device = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.314183] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] resume_guests_state_on_host_boot = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.314352] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.314527] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] rpc_response_timeout = 60 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.314700] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] run_external_periodic_tasks = True {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.314865] 
nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] running_deleted_instance_action = reap {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.315045] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] running_deleted_instance_poll_interval = 1800 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.315199] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] running_deleted_instance_timeout = 0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.315346] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] scheduler_instance_sync_interval = 120 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.315497] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] service_down_time = 60 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.315679] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] servicegroup_driver = db {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.315849] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] shelved_offload_time = 0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.316009] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] shelved_poll_interval = 3600 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.316179] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] shutdown_timeout = 0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.316335] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] source_is_ipv6 = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.316489] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] ssl_only = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.316648] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] state_path = /opt/stack/data/nova {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.316805] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] sync_power_state_interval = 600 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.316974] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] sync_power_state_pool_size = 1000 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.317144] nova-conductor[51797]: DEBUG oslo_service.service [None 
req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] syslog_log_facility = LOG_USER {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.317298] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] tempdir = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.317453] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] timeout_nbd = 10 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.317657] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] transport_url = **** {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.317888] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] update_resources_interval = 0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.317959] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] use_cow_images = True {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.318144] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] use_eventlog = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.318303] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] use_journal = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.318457] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] use_json = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.318610] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] use_rootwrap_daemon = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.318791] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] use_stderr = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.318992] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] use_syslog = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.319123] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vcpu_pin_set = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.319284] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vif_plugging_is_fatal = True {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.319456] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vif_plugging_timeout = 300 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.319661] 
nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] virt_mkfs = [] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.319816] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] volume_usage_poll_interval = 0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.319966] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] watch_log_file = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.320153] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] web = /usr/share/spice-html5 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 476.320432] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_concurrency.disable_process_locking = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.320628] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_concurrency.lock_path = /opt/stack/data/nova {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.320831] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.320996] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.321189] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_messaging_metrics.metrics_process_name = {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.321352] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.321514] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.321735] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] api.auth_strategy = keystone {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.321924] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] api.compute_link_prefix = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.322200] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 
2008-02-01 2008-09-01 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.322295] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] api.dhcp_domain = novalocal {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.322471] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] api.enable_instance_password = True {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.322647] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] api.glance_link_prefix = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.322820] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.323008] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] api.instance_list_cells_batch_strategy = distributed {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.323170] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] api.instance_list_per_project_cells = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.323323] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] api.list_records_by_skipping_down_cells = True {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.323474] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] api.local_metadata_per_cell = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.323631] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] api.max_limit = 1000 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.323790] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] api.metadata_cache_expiration = 15 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.323957] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] api.neutron_default_tenant_id = default {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.324139] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] api.use_forwarded_for = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.324297] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] api.use_neutron_default_nets = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.324460] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] 
api.vendordata_dynamic_connect_timeout = 5 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.324636] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] api.vendordata_dynamic_failure_fatal = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.324800] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.324959] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] api.vendordata_dynamic_ssl_certfile = {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.325136] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] api.vendordata_dynamic_targets = [] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.325290] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] api.vendordata_jsonfile_path = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.325491] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] api.vendordata_providers = ['StaticJSON'] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.325729] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cache.backend = dogpile.cache.memcached {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.325923] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cache.backend_argument = **** {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.326143] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cache.config_prefix = cache.oslo {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.326309] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cache.dead_timeout = 60.0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.326467] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cache.debug_cache_backend = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.326644] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cache.enable_retry_client = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.326803] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cache.enable_socket_keepalive = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.326972] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] 
cache.enabled = True {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.327130] nova-conductor[52216]: DEBUG nova.service [None req-9af4c471-de86-4dbc-9dac-6cd35efd8711 None None] Creating RPC server for service conductor {{(pid=52216) start /opt/stack/nova/nova/service.py:182}} [ 476.327170] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cache.expiration_time = 600 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.327292] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cache.hashclient_retry_attempts = 2 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.327446] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cache.hashclient_retry_delay = 1.0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.327604] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cache.memcache_dead_retry = 300 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.327759] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cache.memcache_password = {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.327911] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.328083] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.328245] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cache.memcache_pool_maxsize = 10 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.328538] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cache.memcache_pool_unused_timeout = 60 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.328593] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cache.memcache_sasl_enabled = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.328721] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cache.memcache_servers = ['localhost:11211'] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.328897] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cache.memcache_socket_timeout = 1.0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.329069] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cache.memcache_username = {{(pid=51797) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.329229] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cache.proxies = [] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.329384] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cache.retry_attempts = 2 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.329542] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cache.retry_delay = 0.0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.329705] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cache.socket_keepalive_count = 1 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.329861] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cache.socket_keepalive_idle = 1 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.329964] nova-conductor[52217]: DEBUG nova.service [None req-e9e23f9a-fcd3-4d64-a297-f5cfe8f22d27 None None] Creating RPC server for service conductor {{(pid=52217) start /opt/stack/nova/nova/service.py:182}} [ 476.330029] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cache.socket_keepalive_interval = 1 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.330187] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cache.tls_allowed_ciphers = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.330338] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cache.tls_cafile = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.330494] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cache.tls_certfile = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.330647] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cache.tls_enabled = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.330797] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cache.tls_keyfile = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.331007] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cinder.auth_section = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.331207] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cinder.auth_type = password {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.331365] nova-conductor[51797]: DEBUG oslo_service.service [None 
req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cinder.cafile = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.331553] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cinder.catalog_info = volumev3::publicURL {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.331709] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cinder.certfile = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.331873] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cinder.collect_timing = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.332062] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cinder.cross_az_attach = True {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.332222] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cinder.debug = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.332373] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cinder.endpoint_template = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.332551] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cinder.http_retries = 3 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.332723] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cinder.insecure = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.332880] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cinder.keyfile = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.333048] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cinder.os_region_name = RegionOne {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.333209] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cinder.split_loggers = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.333360] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cinder.timeout = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.333572] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.333673] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] compute.cpu_dedicated_set = None {{(pid=51797) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.333824] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] compute.cpu_shared_set = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.333984] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] compute.image_type_exclude_list = [] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.334152] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] compute.live_migration_wait_for_vif_plug = True {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.334330] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] compute.max_concurrent_disk_ops = 0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.334486] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] compute.max_disk_devices_to_attach = -1 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.334639] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.334802] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.334957] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] compute.resource_provider_association_refresh = 300 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.335121] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] compute.shutdown_retry_interval = 10 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.335294] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.335468] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] conductor.workers = 2 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.335644] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] console.allowed_origins = [] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.335797] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] console.ssl_ciphers = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.335959] nova-conductor[51797]: DEBUG oslo_service.service [None 
req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] console.ssl_minimum_version = default {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.336154] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] consoleauth.token_ttl = 600 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.336342] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cyborg.cafile = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.336493] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cyborg.certfile = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.336650] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cyborg.collect_timing = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.336802] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cyborg.connect_retries = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.336950] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cyborg.connect_retry_delay = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.337108] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cyborg.endpoint_override = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.337264] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cyborg.insecure = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.337413] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cyborg.keyfile = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.337566] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cyborg.max_version = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.337718] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cyborg.min_version = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.337866] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cyborg.region_name = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.338023] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cyborg.service_name = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.338187] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cyborg.service_type = accelerator {{(pid=51797) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.338340] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cyborg.split_loggers = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.338486] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cyborg.status_code_retries = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.339295] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cyborg.status_code_retry_delay = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.339295] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cyborg.timeout = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.339295] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.339295] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] cyborg.version = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.339295] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] database.backend = sqlalchemy {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.339495] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] database.connection = **** {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.339621] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] database.connection_debug = 0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.339783] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] database.connection_parameters = {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.339945] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] database.connection_recycle_time = 3600 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.340122] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] database.connection_trace = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.340277] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] database.db_inc_retry_interval = True {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.340653] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] database.db_max_retries = 20 {{(pid=51797) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.340653] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] database.db_max_retry_interval = 10 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.340744] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] database.db_retry_interval = 1 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.340879] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] database.max_overflow = 50 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.341034] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] database.max_pool_size = 5 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.341193] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] database.max_retries = 10 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.341823] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] database.mysql_enable_ndb = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.341823] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] database.mysql_sql_mode = TRADITIONAL {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.341823] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] database.mysql_wsrep_sync_wait = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.341823] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] database.pool_timeout = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.341989] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] database.retry_interval = 10 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.342106] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] database.slave_connection = **** {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.342257] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] database.sqlite_synchronous = True {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.342413] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] database.use_db_reconnect = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.342585] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] api_database.backend = sqlalchemy {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} 
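Note on the masked values in this dump: database.connection, database.slave_connection, api_database.connection, api_database.slave_connection and key_manager.fixed_key are printed as **** because oslo.config obfuscates any option registered with secret=True when log_opt_values() writes out the effective configuration. A minimal standalone sketch of that masking behaviour follows; the option registration and the connection URL shown here are illustrative only and are not Nova's own registration code.

import logging

from oslo_config import cfg

logging.basicConfig(level=logging.DEBUG)
LOG = logging.getLogger(__name__)

CONF = cfg.ConfigOpts()
CONF.register_opts(
    [
        # secret=True is what makes the dump print '****' instead of the URL
        cfg.StrOpt('connection', secret=True),
        cfg.IntOpt('max_pool_size', default=5),
    ],
    group='database',
)

CONF([], project='demo')  # no config files needed for this sketch
CONF.set_override('connection', 'mysql+pymysql://user:secret@db/nova',
                  group='database')

# Emits one "group.option = value" line per registered option at DEBUG level,
# masking the secret one, in the same format as the conductor output here.
CONF.log_opt_values(LOG, logging.DEBUG)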
[ 476.342801] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] api_database.connection = **** {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.342926] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] api_database.connection_debug = 0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.343507] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] api_database.connection_parameters = {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.343507] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] api_database.connection_recycle_time = 3600 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.343507] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] api_database.connection_trace = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.343507] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] api_database.db_inc_retry_interval = True {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.343686] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] api_database.db_max_retries = 20 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.343797] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] api_database.db_max_retry_interval = 10 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.343943] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] api_database.db_retry_interval = 1 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.344106] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] api_database.max_overflow = 50 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.344254] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] api_database.max_pool_size = 5 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.344404] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] api_database.max_retries = 10 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.344558] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] api_database.mysql_enable_ndb = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.344720] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 
476.344863] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] api_database.mysql_wsrep_sync_wait = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.345064] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] api_database.pool_timeout = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.345186] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] api_database.retry_interval = 10 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.345325] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] api_database.slave_connection = **** {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.345484] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] api_database.sqlite_synchronous = True {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.345669] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] devices.enabled_mdev_types = [] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.345838] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.345998] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] ephemeral_storage_encryption.enabled = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.346440] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] ephemeral_storage_encryption.key_size = 512 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.346440] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] glance.api_servers = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.346543] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] glance.cafile = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.346650] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] glance.certfile = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.346798] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] glance.collect_timing = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.346975] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] glance.connect_retries = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.347136] 
nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] glance.connect_retry_delay = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.347310] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] glance.debug = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.347511] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] glance.default_trusted_certificate_ids = [] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.347692] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] glance.enable_certificate_validation = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.347940] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] glance.enable_rbd_download = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.347998] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] glance.endpoint_override = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.348143] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] glance.insecure = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.348303] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] glance.keyfile = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.348463] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] glance.max_version = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.348586] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] glance.min_version = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.348739] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] glance.num_retries = 3 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.349030] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] glance.rbd_ceph_conf = {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.349090] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] glance.rbd_connect_timeout = 5 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.349205] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] glance.rbd_pool = {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.349444] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] 
glance.rbd_user = {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.349508] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] glance.region_name = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.349641] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] glance.service_name = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.349801] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] glance.service_type = image {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.349955] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] glance.split_loggers = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.350115] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] glance.status_code_retries = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.350264] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] glance.status_code_retry_delay = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.350413] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] glance.timeout = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.350585] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.350755] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] glance.verify_glance_signatures = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.350906] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] glance.version = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.351079] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] guestfs.debug = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.351263] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] hyperv.config_drive_cdrom = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.351419] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] hyperv.config_drive_inject_password = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.351579] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=51797) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.351736] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] hyperv.enable_instance_metrics_collection = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.351890] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] hyperv.enable_remotefx = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.351930] nova-conductor[52217]: DEBUG nova.service [None req-e9e23f9a-fcd3-4d64-a297-f5cfe8f22d27 None None] Join ServiceGroup membership for this service conductor {{(pid=52217) start /opt/stack/nova/nova/service.py:199}} [ 476.352063] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] hyperv.instances_path_share = {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.352172] nova-conductor[52217]: DEBUG nova.servicegroup.drivers.db [None req-e9e23f9a-fcd3-4d64-a297-f5cfe8f22d27 None None] DB_Driver: join new ServiceGroup member devstack to the conductor group, service = {{(pid=52217) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 476.352220] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] hyperv.iscsi_initiator_list = [] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.352396] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] hyperv.limit_cpu_features = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.352550] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.352701] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.352858] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] hyperv.power_state_check_timeframe = 60 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.353038] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] hyperv.power_state_event_polling_interval = 2 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.353213] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.353369] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] hyperv.use_multipath_io = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.353520] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] hyperv.volume_attach_retry_count = 10 
{{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.353674] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] hyperv.volume_attach_retry_interval = 5 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.353831] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] hyperv.vswitch_name = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.353985] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.354161] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] mks.enabled = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.354722] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.354922] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] image_cache.manager_interval = 2400 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.355096] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] image_cache.precache_concurrency = 1 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.355261] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] image_cache.remove_unused_base_images = True {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.355421] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.355579] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.355910] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] image_cache.subdirectory_name = _base {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.355967] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] ironic.api_max_retries = 60 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.356098] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] ironic.api_retry_interval = 2 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.356251] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] 
ironic.auth_section = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.356408] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] ironic.auth_type = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.356559] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] ironic.cafile = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.356707] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] ironic.certfile = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.356867] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] ironic.collect_timing = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.357029] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] ironic.connect_retries = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.357216] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] ironic.connect_retry_delay = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.357344] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] ironic.endpoint_override = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.357490] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] ironic.insecure = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.357640] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] ironic.keyfile = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.357812] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] ironic.max_version = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.357953] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] ironic.min_version = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.358106] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] ironic.partition_key = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.358252] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] ironic.peer_list = [] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.358405] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] ironic.region_name = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.358558] 
nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] ironic.serial_console_state_timeout = 10 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.358720] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] ironic.service_name = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.358888] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] ironic.service_type = baremetal {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.359056] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] ironic.split_loggers = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.359211] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] ironic.status_code_retries = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.359360] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] ironic.status_code_retry_delay = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.359509] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] ironic.timeout = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.359681] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.359837] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] ironic.version = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.359871] nova-conductor[52216]: DEBUG nova.service [None req-9af4c471-de86-4dbc-9dac-6cd35efd8711 None None] Join ServiceGroup membership for this service conductor {{(pid=52216) start /opt/stack/nova/nova/service.py:199}} [ 476.360043] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.360130] nova-conductor[52216]: DEBUG nova.servicegroup.drivers.db [None req-9af4c471-de86-4dbc-9dac-6cd35efd8711 None None] DB_Driver: join new ServiceGroup member devstack to the conductor group, service = {{(pid=52216) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 476.360257] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] key_manager.fixed_key = **** {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.360449] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 
476.360609] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] barbican.barbican_api_version = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.360778] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] barbican.barbican_endpoint = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.360962] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] barbican.barbican_endpoint_type = public {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.361139] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] barbican.barbican_region_name = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.361306] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] barbican.cafile = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.361457] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] barbican.certfile = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.361612] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] barbican.collect_timing = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.361766] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] barbican.insecure = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.361912] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] barbican.keyfile = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.362077] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] barbican.number_of_retries = 60 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.362248] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] barbican.retry_delay = 1 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.362421] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] barbican.send_service_user_token = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.362577] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] barbican.split_loggers = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.362725] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] barbican.timeout = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.362902] nova-conductor[51797]: DEBUG oslo_service.service [None 
req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] barbican.verify_ssl = True {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.363040] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] barbican.verify_ssl_path = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.363217] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] barbican_service_user.auth_section = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.363397] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] barbican_service_user.auth_type = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.363545] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] barbican_service_user.cafile = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.363694] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] barbican_service_user.certfile = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.363850] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] barbican_service_user.collect_timing = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.364018] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] barbican_service_user.insecure = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.364171] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] barbican_service_user.keyfile = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.364321] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] barbican_service_user.split_loggers = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.364467] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] barbican_service_user.timeout = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.364647] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vault.approle_role_id = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.364799] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vault.approle_secret_id = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.364947] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vault.cafile = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.365105] nova-conductor[51797]: DEBUG oslo_service.service [None 
req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vault.certfile = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.365257] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vault.collect_timing = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.365409] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vault.insecure = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.365558] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vault.keyfile = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.365747] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vault.kv_mountpoint = secret {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.365944] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vault.kv_version = 2 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.366115] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vault.namespace = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.366264] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vault.root_token_id = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.366417] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vault.split_loggers = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.366564] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vault.ssl_ca_crt_file = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.366711] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vault.timeout = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.366863] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vault.use_ssl = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.367050] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.367238] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] keystone.cafile = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.367393] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] keystone.certfile = None {{(pid=51797) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.367551] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] keystone.collect_timing = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.367741] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] keystone.connect_retries = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.367902] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] keystone.connect_retry_delay = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.368078] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] keystone.endpoint_override = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.368233] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] keystone.insecure = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.368379] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] keystone.keyfile = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.368526] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] keystone.max_version = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.368673] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] keystone.min_version = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.368841] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] keystone.region_name = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.368991] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] keystone.service_name = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.369164] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] keystone.service_type = identity {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.369315] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] keystone.split_loggers = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.369462] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] keystone.status_code_retries = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.369611] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] keystone.status_code_retry_delay = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.369759] 
nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] keystone.timeout = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.369928] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.370089] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] keystone.version = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.370313] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.connection_uri = {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.370489] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.cpu_mode = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.370647] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.cpu_model_extra_flags = [] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.370806] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.cpu_models = [] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.370983] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.cpu_power_governor_high = performance {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.371157] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.cpu_power_governor_low = powersave {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.371312] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.cpu_power_management = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.371491] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.371662] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.device_detach_attempts = 8 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.371816] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.device_detach_timeout = 20 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.371971] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.disk_cachemodes = [] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.372132] nova-conductor[51797]: DEBUG 
oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.disk_prefix = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.372289] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.enabled_perf_events = [] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.372446] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.file_backed_memory = 0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.372601] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.gid_maps = [] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.372755] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.hw_disk_discard = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.372904] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.hw_machine_type = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.373071] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.images_rbd_ceph_conf = {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.373226] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.373382] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.373540] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.images_rbd_glance_store_name = {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.373699] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.images_rbd_pool = rbd {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.373857] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.images_type = default {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.374012] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.images_volume_group = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.374247] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.inject_key = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.374421] nova-conductor[51797]: DEBUG oslo_service.service [None 
req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.inject_partition = -2 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.374563] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.inject_password = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.374740] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.iscsi_iface = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.374894] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.iser_use_multipath = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.375057] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.live_migration_bandwidth = 0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.375212] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.live_migration_completion_timeout = 800 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.375363] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.live_migration_downtime = 500 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.375533] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.live_migration_downtime_delay = 75 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.375702] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.live_migration_downtime_steps = 10 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.375856] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.live_migration_inbound_addr = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.376016] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.live_migration_permit_auto_converge = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.376172] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.live_migration_permit_post_copy = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.376324] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.live_migration_scheme = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.376482] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.live_migration_timeout_action = abort {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.376635] 
nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.live_migration_tunnelled = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.376787] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.live_migration_uri = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.377059] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.live_migration_with_native_tls = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.377116] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.max_queues = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.377238] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.mem_stats_period_seconds = 10 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.377418] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.nfs_mount_options = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.377767] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.nfs_mount_point_base = /opt/stack/data/nova/mnt {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.377936] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.num_aoe_discover_tries = 3 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.378106] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.num_iser_scan_tries = 5 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.378266] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.num_memory_encrypted_guests = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.378422] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.num_nvme_discover_tries = 5 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.378578] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.num_pcie_ports = 0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.378736] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.num_volume_scan_tries = 5 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.378992] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.pmem_namespaces = [] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.379177] 
nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.quobyte_client_cfg = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.379413] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.quobyte_mount_point_base = /opt/stack/data/nova/mnt {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.379572] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.rbd_connect_timeout = 5 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.379731] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.379885] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.380049] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.rbd_secret_uuid = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.380201] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.rbd_user = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.380355] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.realtime_scheduler_priority = 1 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.380542] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.remote_filesystem_transport = ssh {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.380700] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.rescue_image_id = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.380850] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.rescue_kernel_id = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.380999] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.rescue_ramdisk_id = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.381190] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.rng_dev_path = /dev/urandom {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.381342] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.rx_queue_size = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.381498] 
nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.smbfs_mount_options = {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.381705] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.smbfs_mount_point_base = /opt/stack/data/nova/mnt {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.381871] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.snapshot_compression = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.382030] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.snapshot_image_format = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.382243] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.382403] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.sparse_logical_volumes = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.382558] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.swtpm_enabled = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.382716] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.swtpm_group = tss {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.382874] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.swtpm_user = tss {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.383041] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.sysinfo_serial = unique {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.383195] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.tx_queue_size = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.383349] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.uid_maps = [] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.383509] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.use_virtio_for_bridges = True {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.383670] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.virt_type = kvm {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.383829] 
nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.volume_clear = zero {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.383981] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.volume_clear_size = 0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.384149] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.volume_use_multipath = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.384300] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.vzstorage_cache_path = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.384462] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.384624] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.vzstorage_mount_group = qemu {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.384778] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.vzstorage_mount_opts = [] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.384934] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.385152] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/nova/mnt {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.385313] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.vzstorage_mount_user = stack {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.385472] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.385657] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] neutron.auth_section = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.385824] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] neutron.auth_type = password {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.385975] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] neutron.cafile = None {{(pid=51797) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.386140] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] neutron.certfile = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.386298] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] neutron.collect_timing = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.386466] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] neutron.connect_retries = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.386614] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] neutron.connect_retry_delay = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.386779] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] neutron.default_floating_pool = public {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.386930] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] neutron.endpoint_override = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.387095] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] neutron.extension_sync_interval = 600 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.387245] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] neutron.http_retries = 3 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.387395] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] neutron.insecure = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.387543] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] neutron.keyfile = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.387719] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] neutron.max_version = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.387891] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] neutron.metadata_proxy_shared_secret = **** {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.388051] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] neutron.min_version = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.388215] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] neutron.ovs_bridge = br-int {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.388370] 
nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] neutron.physnets = [] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.388534] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] neutron.region_name = RegionOne {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.388704] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] neutron.service_metadata_proxy = True {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.388847] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] neutron.service_name = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.389019] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] neutron.service_type = network {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.389183] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] neutron.split_loggers = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.389333] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] neutron.status_code_retries = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.389482] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] neutron.status_code_retry_delay = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.389631] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] neutron.timeout = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.389800] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.389967] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] neutron.version = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.390176] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] notifications.bdms_in_notifications = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.390350] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] notifications.default_level = INFO {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.390515] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] notifications.notification_format = unversioned {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.390671] nova-conductor[51797]: DEBUG 
oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] notifications.notify_on_state_change = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.390836] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.391035] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] pci.alias = [] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.391219] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] pci.device_spec = [] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.391382] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] pci.report_in_placement = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.391572] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] placement.auth_section = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.391738] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] placement.auth_type = password {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.391931] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] placement.auth_url = http://10.180.1.21/identity {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.392096] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] placement.cafile = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.392249] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] placement.certfile = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.392404] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] placement.collect_timing = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.392555] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] placement.connect_retries = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.392703] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] placement.connect_retry_delay = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.392852] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] placement.default_domain_id = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.392999] nova-conductor[51797]: DEBUG oslo_service.service [None 
req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] placement.default_domain_name = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.393158] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] placement.domain_id = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.393305] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] placement.domain_name = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.393455] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] placement.endpoint_override = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.393610] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] placement.insecure = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.393830] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] placement.keyfile = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.394047] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] placement.max_version = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.394211] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] placement.min_version = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.394380] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] placement.password = **** {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.394534] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] placement.project_domain_id = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.394708] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] placement.project_domain_name = Default {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.394852] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] placement.project_id = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.395024] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] placement.project_name = service {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.395186] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] placement.region_name = RegionOne {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.395337] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] placement.service_name = None 
{{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.395500] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] placement.service_type = placement {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.395658] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] placement.split_loggers = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.395810] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] placement.status_code_retries = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.395963] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] placement.status_code_retry_delay = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.396126] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] placement.system_scope = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.396273] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] placement.timeout = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.396422] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] placement.trust_id = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.396572] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] placement.user_domain_id = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.396728] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] placement.user_domain_name = Default {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.396886] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] placement.user_id = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.397125] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] placement.username = placement {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.397313] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.397468] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] placement.version = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.397666] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] quota.cores = 20 {{(pid=51797) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.397834] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] quota.count_usage_from_placement = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.398008] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.398195] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] quota.injected_file_content_bytes = 10240 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.398353] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] quota.injected_file_path_length = 255 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.398509] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] quota.injected_files = 5 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.398665] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] quota.instances = 10 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.398821] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] quota.key_pairs = 100 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.398978] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] quota.metadata_items = 128 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.399151] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] quota.ram = 51200 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.399308] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] quota.recheck_quota = True {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.399465] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] quota.server_group_members = 10 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.399621] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] quota.server_groups = 10 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.399784] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] rdp.enabled = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.400118] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.400331] 
nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.400517] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.400693] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] scheduler.image_metadata_prefilter = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.400869] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.401051] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] scheduler.max_attempts = 3 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.401227] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] scheduler.max_placement_results = 1000 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.401398] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.401568] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] scheduler.query_placement_for_availability_zone = True {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.401728] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] scheduler.query_placement_for_image_type_support = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.401901] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.402098] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] scheduler.workers = 2 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.402304] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.402476] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] filter_scheduler.aggregate_image_properties_isolation_separator = . 
{{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.402672] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.402840] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.403023] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.403183] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.403337] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.403535] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.403693] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] filter_scheduler.host_subset_size = 1 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.403847] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] filter_scheduler.image_properties_default_architecture = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.403998] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.404166] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] filter_scheduler.isolated_hosts = [] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.404324] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] filter_scheduler.isolated_images = [] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.404475] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] filter_scheduler.max_instances_per_host = 50 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.404623] nova-conductor[51797]: DEBUG oslo_service.service [None 
req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.404792] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] filter_scheduler.pci_in_placement = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.404928] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.405090] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.405247] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.405399] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.405570] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.405736] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.405915] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] filter_scheduler.track_instance_changes = True {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.406093] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.406265] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] metrics.required = True {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.406419] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] metrics.weight_multiplier = 1.0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.406572] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] metrics.weight_of_unavailable = -10000.0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.406728] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] metrics.weight_setting = [] {{(pid=51797) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.407048] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.407216] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] serial_console.enabled = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.407403] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] serial_console.port_range = 10000:20000 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.407566] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.407752] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.407917] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] serial_console.serialproxy_port = 6083 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.408086] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] service_user.auth_section = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.408251] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] service_user.auth_type = password {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.408412] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] service_user.cafile = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.408561] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] service_user.certfile = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.408715] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] service_user.collect_timing = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.408866] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] service_user.insecure = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.409017] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] service_user.keyfile = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.409182] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] service_user.send_service_user_token = True {{(pid=51797) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.409333] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] service_user.split_loggers = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.409482] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] service_user.timeout = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.409644] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] spice.agent_enabled = True {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.409801] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] spice.enabled = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.410142] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.410360] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] spice.html5proxy_host = 0.0.0.0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.410525] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] spice.html5proxy_port = 6082 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.410681] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] spice.image_compression = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.410851] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] spice.jpeg_compression = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.411007] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] spice.playback_compression = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.411175] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] spice.server_listen = 127.0.0.1 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.411332] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.411482] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] spice.streaming_mode = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.411631] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] spice.zlib_compression = None {{(pid=51797) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.411792] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] upgrade_levels.baseapi = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.411939] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] upgrade_levels.cert = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.412110] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] upgrade_levels.compute = auto {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.412262] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] upgrade_levels.conductor = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.412409] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] upgrade_levels.scheduler = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.412567] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vendordata_dynamic_auth.auth_section = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.412722] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vendordata_dynamic_auth.auth_type = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.412870] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vendordata_dynamic_auth.cafile = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.413027] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vendordata_dynamic_auth.certfile = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.413188] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vendordata_dynamic_auth.collect_timing = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.413341] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vendordata_dynamic_auth.insecure = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.413490] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vendordata_dynamic_auth.keyfile = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.413645] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vendordata_dynamic_auth.split_loggers = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.413796] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vendordata_dynamic_auth.timeout = None {{(pid=51797) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.413988] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vmware.api_retry_count = 10 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.414152] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vmware.ca_file = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.414323] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vmware.cache_prefix = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.414474] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vmware.cluster_name = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.414629] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vmware.connection_pool_size = 10 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.414777] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vmware.console_delay_seconds = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.414922] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vmware.datastore_regex = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.415103] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vmware.host_ip = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.415253] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vmware.host_password = **** {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.415405] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vmware.host_port = 443 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.415574] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vmware.host_username = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.415741] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vmware.insecure = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.415898] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vmware.integration_bridge = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.416068] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vmware.maximum_objects = 100 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.416224] nova-conductor[51797]: DEBUG 
oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vmware.pbm_default_policy = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.416379] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vmware.pbm_enabled = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.416528] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vmware.pbm_wsdl_location = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.416689] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vmware.serial_log_dir = /opt/vmware/vspc {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.416845] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vmware.serial_port_proxy_uri = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.416998] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vmware.serial_port_service_uri = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.417168] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vmware.task_poll_interval = 0.5 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.417322] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vmware.use_linked_clone = True {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.417484] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vmware.vnc_keymap = en-us {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.417670] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vmware.vnc_port = 5900 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.417829] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vmware.vnc_port_total = 10000 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.418049] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vnc.auth_schemes = ['none'] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.418234] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vnc.enabled = True {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.418539] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.418724] nova-conductor[51797]: DEBUG oslo_service.service [None 
req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.418883] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vnc.novncproxy_port = 6080 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.419065] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vnc.server_listen = 127.0.0.1 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.419233] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.419386] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vnc.vencrypt_ca_certs = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.419537] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vnc.vencrypt_client_cert = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.419686] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] vnc.vencrypt_client_key = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.419890] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.420085] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] workarounds.disable_deep_image_inspection = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.420250] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] workarounds.disable_fallback_pcpu_query = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.420405] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] workarounds.disable_group_policy_check_upcall = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.420559] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.420711] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] workarounds.disable_rootwrap = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.420867] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] workarounds.enable_numa_live_migration = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.421029] 
nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.421177] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.421328] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] workarounds.handle_virt_lifecycle_events = True {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.421480] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] workarounds.libvirt_disable_apic = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.421632] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] workarounds.never_download_image_if_on_rbd = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.421784] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.421932] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.422095] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.422249] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.422399] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.422551] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.422701] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.422855] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.423019] nova-conductor[51797]: DEBUG oslo_service.service [None 
req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.423212] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.423387] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] wsgi.client_socket_timeout = 900 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.423543] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] wsgi.default_pool_size = 1000 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.423703] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] wsgi.keep_alive = True {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.423862] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] wsgi.max_header_line = 16384 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.424028] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] wsgi.secure_proxy_ssl_header = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.424188] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] wsgi.ssl_ca_file = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.424337] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] wsgi.ssl_cert_file = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.424486] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] wsgi.ssl_key_file = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.424639] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] wsgi.tcp_keepidle = 600 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.424809] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.424961] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] zvm.ca_file = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.425124] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] zvm.cloud_connector_url = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.425335] nova-conductor[51797]: DEBUG 
oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] zvm.image_tmp_path = /opt/stack/data/nova/images {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.425488] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] zvm.reachable_timeout = 300 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.425732] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_policy.enforce_new_defaults = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.425904] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_policy.enforce_scope = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.426099] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_policy.policy_default_rule = default {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.426303] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.426486] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_policy.policy_file = policy.yaml {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.426679] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.426852] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.427011] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.427171] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.427325] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.427519] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.427722] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=51797) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.427932] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] profiler.connection_string = messaging:// {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.428126] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] profiler.enabled = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.428308] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] profiler.es_doc_type = notification {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.428480] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] profiler.es_scroll_size = 10000 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.428642] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] profiler.es_scroll_time = 2m {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.428797] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] profiler.filter_error_trace = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.428954] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] profiler.hmac_keys = SECRET_KEY {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.429124] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] profiler.sentinel_service_name = mymaster {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.429306] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] profiler.socket_timeout = 0.1 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.429470] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] profiler.trace_sqlalchemy = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.429662] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] remote_debug.host = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.429834] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] remote_debug.port = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.430282] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.430282] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=51797) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.430402] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.430558] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.430722] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.430879] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.431046] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.431206] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.431363] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.431513] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.431678] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.431841] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.432008] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.432177] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.432329] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 
476.432499] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.432651] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.432809] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.432967] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.433135] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.433292] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.433449] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.433602] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.433760] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.433916] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.434093] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_messaging_rabbit.ssl = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.434259] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.434418] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.434574] nova-conductor[51797]: DEBUG oslo_service.service [None 
req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.434736] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.434895] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_messaging_rabbit.ssl_version = {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.435111] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.435277] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_messaging_notifications.retry = -1 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.435470] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.435661] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_messaging_notifications.transport_url = **** {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.435859] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_limit.auth_section = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.436024] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_limit.auth_type = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.436180] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_limit.cafile = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.436330] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_limit.certfile = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.436486] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_limit.collect_timing = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.436637] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_limit.connect_retries = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.436790] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_limit.connect_retry_delay = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.436940] nova-conductor[51797]: DEBUG 
oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_limit.endpoint_id = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.437099] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_limit.endpoint_override = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.437253] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_limit.insecure = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.437401] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_limit.keyfile = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.437549] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_limit.max_version = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.437728] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_limit.min_version = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.437888] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_limit.region_name = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.438069] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_limit.service_name = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.438220] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_limit.service_type = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.438376] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_limit.split_loggers = False {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.438525] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_limit.status_code_retries = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.438673] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_limit.status_code_retry_delay = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.438821] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_limit.timeout = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.438966] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_limit.valid_interfaces = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.439125] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 
None None] oslo_limit.version = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.439329] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_reports.file_event_handler = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.439490] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_reports.file_event_handler_interval = 1 {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.439641] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] oslo_reports.log_dir = None {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 476.439767] nova-conductor[51797]: DEBUG oslo_service.service [None req-35279ecb-ca7c-4447-844f-47b53bcb94f2 None None] ******************************************************************************** {{(pid=51797) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2613}} [ 561.445172] nova-conductor[52216]: DEBUG oslo_db.sqlalchemy.engines [None req-5befa0c3-b5f6-4929-a038-2f3148ae0a89 None None] Parent process 51797 forked (52216) with an open database connection, which is being discarded and recreated. {{(pid=52216) checkout /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:434}} [ 594.380913] nova-conductor[52217]: DEBUG oslo_db.sqlalchemy.engines [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Parent process 51797 forked (52217) with an open database connection, which is being discarded and recreated. {{(pid=52217) checkout /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:434}} [ 594.871145] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Took 0.48 seconds to select destinations for 1 instance(s). 
{{(pid=52217) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 594.895498] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 594.895775] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 594.897324] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 594.901629] nova-conductor[52217]: DEBUG oslo_db.sqlalchemy.engines [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=52217) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 594.956115] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 594.956365] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 594.956845] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 594.957225] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 
tempest-DeleteServersAdminTestJSON-365478894-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 594.957407] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 594.957564] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 594.964801] nova-conductor[52217]: DEBUG oslo_db.sqlalchemy.engines [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=52217) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 594.978489] nova-conductor[52217]: DEBUG nova.quota [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Getting quotas for project 3b081340a09140ed9283752785719b50. Resources: {'cores', 'instances', 'ram'} {{(pid=52217) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 594.981488] nova-conductor[52217]: DEBUG nova.quota [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Getting quotas for user bff99d63ab404ea09c032a89af2616e0 and project 3b081340a09140ed9283752785719b50. 
Resources: {'cores', 'instances', 'ram'} {{(pid=52217) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 594.986565] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52217) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 594.987149] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 594.987360] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 594.987513] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 594.991783] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 594.992462] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 594.992678] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Lock 
"ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 594.992812] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 595.014066] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 595.014284] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 595.014449] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 595.014739] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Acquiring lock "compute-rpcapi-router" {{(pid=52217) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 595.014891] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Acquired lock "compute-rpcapi-router" {{(pid=52217) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 595.015389] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-1fc78fff-cd80-457c-b556-2f38b58f381f None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 595.015583] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-1fc78fff-cd80-457c-b556-2f38b58f381f None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 595.015744] 
nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-1fc78fff-cd80-457c-b556-2f38b58f381f None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 595.016058] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-1fc78fff-cd80-457c-b556-2f38b58f381f None None] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 595.016232] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-1fc78fff-cd80-457c-b556-2f38b58f381f None None] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 595.016384] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-1fc78fff-cd80-457c-b556-2f38b58f381f None None] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 595.021876] nova-conductor[52217]: INFO nova.compute.rpcapi [None req-1fc78fff-cd80-457c-b556-2f38b58f381f None None] Automatically selected compute RPC version 6.2 from minimum service version 66 [ 595.022315] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-1fc78fff-cd80-457c-b556-2f38b58f381f None None] Releasing lock "compute-rpcapi-router" {{(pid=52217) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 604.064765] nova-conductor[52217]: ERROR nova.scheduler.utils [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File 
"/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port db2ecbed-1855-42ba-bc68-ab748c2d4651, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 1c80719e-14e8-467f-9195-683f681b0fd1 was re-scheduled: Binding failed for port db2ecbed-1855-42ba-bc68-ab748c2d4651, please check neutron logs for more information.\n'] [ 604.069675] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Rescheduling: True {{(pid=52217) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 604.070611] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 1c80719e-14e8-467f-9195-683f681b0fd1.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 1c80719e-14e8-467f-9195-683f681b0fd1. 
[ 604.070611] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 1c80719e-14e8-467f-9195-683f681b0fd1. [ 604.134244] nova-conductor[52217]: DEBUG nova.network.neutron [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] deallocate_for_instance() {{(pid=52217) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 604.536197] nova-conductor[52217]: DEBUG nova.network.neutron [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] Instance cache missing network info. {{(pid=52217) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 604.544606] nova-conductor[52217]: DEBUG nova.network.neutron [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] Updating instance_info_cache with network_info: [] {{(pid=52217) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 605.844811] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Took 0.27 seconds to select destinations for 1 instance(s). 
{{(pid=52217) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 605.864186] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 605.864683] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.004s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 605.864954] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 605.914792] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 605.915503] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 605.915503] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 605.915625] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 605.916197] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Lock 
"ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 605.916197] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 605.928451] nova-conductor[52217]: DEBUG nova.quota [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Getting quotas for project 9468510ce958424da5fc4ec68a07d6e9. Resources: {'cores', 'instances', 'ram'} {{(pid=52217) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 605.932646] nova-conductor[52217]: DEBUG nova.quota [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Getting quotas for user 0a4bfcff342a4ccbbc38ba84f9b50561 and project 9468510ce958424da5fc4ec68a07d6e9. Resources: {'cores', 'instances', 'ram'} {{(pid=52217) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 605.942757] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52217) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 605.942757] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 605.942757] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 605.942757] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 605.947612] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] 
[instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 605.948407] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 605.948681] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 605.948857] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 605.963136] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 605.963364] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 605.963569] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 608.273533] nova-conductor[52217]: DEBUG nova.conductor.manager [None 
req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Took 0.15 seconds to select destinations for 1 instance(s). {{(pid=52217) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 608.290715] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 608.290964] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 608.291193] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 608.329535] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 608.331855] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 608.331958] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.003s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 608.332291] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 608.332480] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None 
req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 608.332618] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 608.345162] nova-conductor[52217]: DEBUG nova.quota [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Getting quotas for project ec0cec8728ec40b286d68178a70794e8. Resources: {'cores', 'instances', 'ram'} {{(pid=52217) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 608.347675] nova-conductor[52217]: DEBUG nova.quota [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Getting quotas for user c96ee8fc6a5445098644e375c7df4919 and project ec0cec8728ec40b286d68178a70794e8. Resources: {'cores', 'instances', 'ram'} {{(pid=52217) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 608.356013] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52217) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 608.356698] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 608.357029] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 608.358177] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 608.360959] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 
tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 608.361798] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 608.362139] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 608.362340] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 608.376701] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 608.376874] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 608.377162] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.275132] nova-conductor[52216]: 
DEBUG nova.conductor.manager [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Took 0.17 seconds to select destinations for 1 instance(s). {{(pid=52216) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 610.304395] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.304395] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.305991] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.002s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.310307] nova-conductor[52216]: DEBUG oslo_db.sqlalchemy.engines [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=52216) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 610.375279] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.375279] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.375279] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.376094] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.376094] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.376094] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.384092] nova-conductor[52216]: DEBUG oslo_db.sqlalchemy.engines [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=52216) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 610.397673] nova-conductor[52216]: DEBUG nova.quota [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Getting quotas for project 64a2dca870f64583ae77cb64d7eff903. Resources: {'cores', 'instances', 'ram'} {{(pid=52216) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 610.400242] nova-conductor[52216]: DEBUG nova.quota [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Getting quotas for user 2f73951d67dc467a96738f87697c6f62 and project 64a2dca870f64583ae77cb64d7eff903. 
Resources: {'cores', 'instances', 'ram'} {{(pid=52216) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 610.405547] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52216) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 610.406139] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.407253] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.407253] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.414093] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 610.414093] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.414291] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.414397] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.446423] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.446690] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.446813] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.447136] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Acquiring lock "compute-rpcapi-router" {{(pid=52216) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 610.447217] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Acquired lock "compute-rpcapi-router" {{(pid=52216) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 610.447733] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-ec6f36e7-1f23-415f-9d98-9c23d00a393e None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.447894] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-ec6f36e7-1f23-415f-9d98-9c23d00a393e None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.448055] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None 
req-ec6f36e7-1f23-415f-9d98-9c23d00a393e None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.448394] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-ec6f36e7-1f23-415f-9d98-9c23d00a393e None None] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.448587] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-ec6f36e7-1f23-415f-9d98-9c23d00a393e None None] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.448741] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-ec6f36e7-1f23-415f-9d98-9c23d00a393e None None] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.455024] nova-conductor[52216]: INFO nova.compute.rpcapi [None req-ec6f36e7-1f23-415f-9d98-9c23d00a393e None None] Automatically selected compute RPC version 6.2 from minimum service version 66 [ 610.455865] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-ec6f36e7-1f23-415f-9d98-9c23d00a393e None None] Releasing lock "compute-rpcapi-router" {{(pid=52216) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 611.433554] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Took 0.18 seconds to select destinations for 1 instance(s). 
{{(pid=52217) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 611.468653] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 611.469045] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 611.469593] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 611.524660] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 611.524660] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 611.524660] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 611.524951] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 611.525136] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Lock 
"ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 611.525289] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 611.537338] nova-conductor[52217]: DEBUG nova.quota [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Getting quotas for project 3b081340a09140ed9283752785719b50. Resources: {'cores', 'instances', 'ram'} {{(pid=52217) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 611.539465] nova-conductor[52217]: DEBUG nova.quota [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Getting quotas for user bff99d63ab404ea09c032a89af2616e0 and project 3b081340a09140ed9283752785719b50. Resources: {'cores', 'instances', 'ram'} {{(pid=52217) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 611.547707] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52217) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 611.548937] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 611.548937] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 611.548937] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 611.552882] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] 
[instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 611.553567] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 611.554163] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 611.554163] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 611.572658] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 611.572870] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 611.573261] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 613.642530] nova-conductor[52216]: DEBUG nova.conductor.manager [None 
req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Took 0.17 seconds to select destinations for 1 instance(s). {{(pid=52216) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 613.660225] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 613.660470] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 613.660642] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 613.694635] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 613.695296] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 613.695553] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 613.695920] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 613.696122] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5791d303-4116-492a-a4ea-80ee8f90a627 
tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 613.696281] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 613.710507] nova-conductor[52216]: DEBUG nova.quota [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Getting quotas for project 1c65b725b7474926832d3ffd92af67db. Resources: {'cores', 'instances', 'ram'} {{(pid=52216) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 613.713828] nova-conductor[52216]: DEBUG nova.quota [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Getting quotas for user 9ca548ffb6fe4cd9a91d5a743fd851cd and project 1c65b725b7474926832d3ffd92af67db. Resources: {'cores', 'instances', 'ram'} {{(pid=52216) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 613.720897] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 05155891-6002-4ac0-8386-62e8db523152] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52216) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 613.721396] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 613.721635] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 613.721853] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 613.725119] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 
tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 05155891-6002-4ac0-8386-62e8db523152] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 613.725779] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 613.726015] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 613.726195] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 613.750251] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 613.750251] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 613.750417] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 615.409906] nova-conductor[52217]: DEBUG nova.conductor.manager [None 
req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Took 0.24 seconds to select destinations for 1 instance(s). {{(pid=52217) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 615.427819] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 615.428067] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 615.428235] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 615.477702] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 615.477848] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 615.478408] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 615.478408] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 615.478651] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None 
req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 615.479343] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 615.490078] nova-conductor[52217]: DEBUG nova.quota [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Getting quotas for project 359089d39b2f4c95960a6768f9b4d1e6. Resources: {'cores', 'instances', 'ram'} {{(pid=52217) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 615.495761] nova-conductor[52217]: DEBUG nova.quota [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Getting quotas for user 590bc22d21164771b6472358c1c3bfad and project 359089d39b2f4c95960a6768f9b4d1e6. Resources: {'cores', 'instances', 'ram'} {{(pid=52217) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 615.501820] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52217) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 615.503581] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 615.503581] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 615.503581] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 615.507597] nova-conductor[52217]: DEBUG nova.conductor.manager [None 
req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 615.507597] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 615.507597] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 615.507597] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 615.524213] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 615.524370] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 615.524540] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 617.187857] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Took 0.16 seconds to select destinations for 1 instance(s). {{(pid=52217) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 617.202339] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 617.202561] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 617.202735] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 617.214566] nova-conductor[52216]: ERROR nova.scheduler.utils [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File 
"/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 30aab2fb-c3eb-4490-85a0-503a72da63d2, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 89af0723-c7fc-4d6f-90f5-6f69e7a3630b was re-scheduled: Binding failed for port 30aab2fb-c3eb-4490-85a0-503a72da63d2, please check neutron logs for more information.\n'] [ 617.219145] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Rescheduling: True {{(pid=52216) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 617.219145] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 89af0723-c7fc-4d6f-90f5-6f69e7a3630b.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 89af0723-c7fc-4d6f-90f5-6f69e7a3630b. [ 617.219145] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 89af0723-c7fc-4d6f-90f5-6f69e7a3630b. 
[ 617.250342] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 617.250508] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 617.250669] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 617.251299] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 617.251299] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 617.251547] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 617.263019] nova-conductor[52217]: DEBUG nova.quota [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Getting quotas for project 1c65b725b7474926832d3ffd92af67db. Resources: {'cores', 'instances', 'ram'} {{(pid=52217) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 617.266119] nova-conductor[52217]: DEBUG nova.quota [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Getting quotas for user 9ca548ffb6fe4cd9a91d5a743fd851cd and project 1c65b725b7474926832d3ffd92af67db. 
Resources: {'cores', 'instances', 'ram'} {{(pid=52217) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 617.273744] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52217) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 617.274266] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 617.274656] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 617.275145] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 617.277979] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 617.278614] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 617.278843] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 617.279091] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 617.297349] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 617.297349] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 617.297499] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 617.304499] nova-conductor[52216]: DEBUG nova.network.neutron [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] deallocate_for_instance() {{(pid=52216) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 617.901553] nova-conductor[52216]: DEBUG nova.network.neutron [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] Instance cache missing network info. {{(pid=52216) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 617.910215] nova-conductor[52216]: DEBUG nova.network.neutron [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] Updating instance_info_cache with network_info: [] {{(pid=52216) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 622.729975] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Took 0.14 seconds to select destinations for 1 instance(s). 
{{(pid=52216) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 622.746872] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 622.747163] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 622.747163] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 622.806045] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 622.806045] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 622.806045] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 622.806045] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 622.806045] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 
tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 622.806045] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 622.818962] nova-conductor[52216]: DEBUG nova.quota [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Getting quotas for project bd2bc2d1b262440f90066cdcb1bfc630. Resources: {'cores', 'instances', 'ram'} {{(pid=52216) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 622.822094] nova-conductor[52216]: DEBUG nova.quota [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Getting quotas for user 7edc0db7734849eebeb3768cf13f3408 and project bd2bc2d1b262440f90066cdcb1bfc630. Resources: {'cores', 'instances', 'ram'} {{(pid=52216) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 622.828298] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52216) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 622.828709] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 622.828933] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 622.829114] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 622.832111] nova-conductor[52216]: DEBUG nova.conductor.manager [None 
req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 622.832798] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 622.832937] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 622.833108] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 622.856143] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 622.856378] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 622.856917] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 
0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 623.843689] nova-conductor[52217]: ERROR nova.scheduler.utils [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port b10348ce-ec95-49ff-bce7-29c566656dd9, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File 
"/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 757b0e86-7d50-46c8-b69a-7e729d925cb1 was re-scheduled: Binding failed for port b10348ce-ec95-49ff-bce7-29c566656dd9, please check neutron logs for more information.\n'] [ 623.844174] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Rescheduling: True {{(pid=52217) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 623.844257] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 757b0e86-7d50-46c8-b69a-7e729d925cb1.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 757b0e86-7d50-46c8-b69a-7e729d925cb1. [ 623.844780] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 757b0e86-7d50-46c8-b69a-7e729d925cb1. 
[ 623.885305] nova-conductor[52217]: DEBUG nova.network.neutron [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] deallocate_for_instance() {{(pid=52217) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 623.973997] nova-conductor[52217]: ERROR nova.scheduler.utils [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 
'nova.exception.PortBindingFailed: Binding failed for port 438a9f07-b733-41d5-a82a-a560eeadf95c, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe was re-scheduled: Binding failed for port 438a9f07-b733-41d5-a82a-a560eeadf95c, please check neutron logs for more information.\n'] [ 623.974577] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Rescheduling: True {{(pid=52217) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 623.974943] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe. [ 623.975068] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe. [ 623.995198] nova-conductor[52217]: DEBUG nova.network.neutron [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] deallocate_for_instance() {{(pid=52217) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 624.078146] nova-conductor[52217]: DEBUG nova.network.neutron [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Instance cache missing network info. 
{{(pid=52217) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 624.079236] nova-conductor[52217]: DEBUG nova.network.neutron [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Updating instance_info_cache with network_info: [] {{(pid=52217) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 624.128115] nova-conductor[52217]: DEBUG nova.network.neutron [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] Instance cache missing network info. {{(pid=52217) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 624.129838] nova-conductor[52217]: DEBUG nova.network.neutron [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] Updating instance_info_cache with network_info: [] {{(pid=52217) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 625.166269] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Took 0.14 seconds to select destinations for 1 instance(s). {{(pid=52217) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 625.184730] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 625.184730] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 625.184730] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 625.246883] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 625.247267] 
nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 625.247267] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 625.247611] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 625.247781] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 625.247932] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 625.266283] nova-conductor[52217]: DEBUG nova.quota [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Getting quotas for project 0584adefba3a4307aed8fd8fcca5267f. Resources: {'cores', 'instances', 'ram'} {{(pid=52217) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 625.271428] nova-conductor[52217]: DEBUG nova.quota [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Getting quotas for user efa9362057904c4eadee051c14b92935 and project 0584adefba3a4307aed8fd8fcca5267f. 
Resources: {'cores', 'instances', 'ram'} {{(pid=52217) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 625.283438] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52217) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 625.284202] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 625.285050] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 625.285050] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 625.289381] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 625.290241] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 625.290550] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 
tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 625.290856] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 625.305378] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 625.305722] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 625.306085] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 626.322973] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Took 0.19 seconds to select destinations for 1 instance(s). 
{{(pid=52217) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 626.342064] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 626.342064] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 626.342064] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 626.397164] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 626.397386] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 626.397547] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 626.397890] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 626.398103] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 
tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 626.398262] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 626.412765] nova-conductor[52217]: DEBUG nova.quota [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Getting quotas for project d83638802c3948e7a158d4a31c0c904b. Resources: {'cores', 'instances', 'ram'} {{(pid=52217) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 626.417047] nova-conductor[52217]: DEBUG nova.quota [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Getting quotas for user b85992f2684e4986aa987d719b82db40 and project d83638802c3948e7a158d4a31c0c904b. Resources: {'cores', 'instances', 'ram'} {{(pid=52217) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 626.429575] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52217) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 626.430844] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 626.430844] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 626.430844] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 626.436217] nova-conductor[52217]: DEBUG nova.conductor.manager [None 
req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 626.436217] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 626.436217] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 626.436992] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 626.456922] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 626.457749] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 626.457749] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s 
{{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 627.055302] nova-conductor[52216]: ERROR nova.scheduler.utils [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 05155891-6002-4ac0-8386-62e8db523152] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 511cb5ac-4803-49ba-bedd-e40113b843bd, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File 
"/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 05155891-6002-4ac0-8386-62e8db523152 was re-scheduled: Binding failed for port 511cb5ac-4803-49ba-bedd-e40113b843bd, please check neutron logs for more information.\n'] [ 627.057720] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Rescheduling: True {{(pid=52216) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 627.058103] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 05155891-6002-4ac0-8386-62e8db523152.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 05155891-6002-4ac0-8386-62e8db523152. [ 627.058867] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 05155891-6002-4ac0-8386-62e8db523152] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 05155891-6002-4ac0-8386-62e8db523152. [ 627.160538] nova-conductor[52216]: DEBUG nova.network.neutron [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 05155891-6002-4ac0-8386-62e8db523152] deallocate_for_instance() {{(pid=52216) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 627.227314] nova-conductor[52216]: DEBUG nova.network.neutron [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 05155891-6002-4ac0-8386-62e8db523152] Instance cache missing network info. {{(pid=52216) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 627.235083] nova-conductor[52216]: DEBUG nova.network.neutron [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 05155891-6002-4ac0-8386-62e8db523152] Updating instance_info_cache with network_info: [] {{(pid=52216) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 627.295248] nova-conductor[52217]: ERROR nova.conductor.manager [None req-0c941f8f-55ad-489d-9bd2-fd2d32270572 tempest-ServersAdmin275Test-747893895 tempest-ServersAdmin275Test-747893895-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 627.295248] nova-conductor[52217]: Traceback (most recent call last): [ 627.295248] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 627.295248] nova-conductor[52217]: return func(*args, **kwargs) [ 627.295248] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 627.295248] nova-conductor[52217]: selections = self._select_destinations( [ 627.295248] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 627.295248] nova-conductor[52217]: selections = self._schedule( [ 627.295248] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 627.295248] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 627.295248] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 627.295248] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 627.295248] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 627.295248] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 627.295248] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 627.295248] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 627.295248] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 627.295248] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 627.295248] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 627.295248] nova-conductor[52217]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 627.295248] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 627.295248] nova-conductor[52217]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 627.295248] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 627.295248] nova-conductor[52217]: ERROR nova.conductor.manager result = self.transport._send( [ 627.295248] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 627.295248] nova-conductor[52217]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 627.295248] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 627.295248] nova-conductor[52217]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 627.295248] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 627.295248] nova-conductor[52217]: ERROR nova.conductor.manager raise result [ 627.295248] nova-conductor[52217]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 627.295248] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 627.295248] nova-conductor[52217]: ERROR nova.conductor.manager [ 627.295248] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 627.295248] nova-conductor[52217]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 627.295248] nova-conductor[52217]: ERROR nova.conductor.manager [ 627.295248] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 627.295248] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._select_destinations( [ 627.295248] nova-conductor[52217]: ERROR nova.conductor.manager [ 627.295248] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 627.295248] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._schedule( [ 627.295248] nova-conductor[52217]: ERROR nova.conductor.manager [ 627.295248] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 627.295248] nova-conductor[52217]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 627.295248] nova-conductor[52217]: ERROR nova.conductor.manager [ 627.295248] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 627.295248] nova-conductor[52217]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 627.295248] nova-conductor[52217]: ERROR nova.conductor.manager [ 627.295248] nova-conductor[52217]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 627.295248] nova-conductor[52217]: ERROR nova.conductor.manager [ 627.295248] nova-conductor[52217]: ERROR nova.conductor.manager [ 627.308450] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-0c941f8f-55ad-489d-9bd2-fd2d32270572 tempest-ServersAdmin275Test-747893895 tempest-ServersAdmin275Test-747893895-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 627.308676] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-0c941f8f-55ad-489d-9bd2-fd2d32270572 tempest-ServersAdmin275Test-747893895 tempest-ServersAdmin275Test-747893895-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 627.308840] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-0c941f8f-55ad-489d-9bd2-fd2d32270572 tempest-ServersAdmin275Test-747893895 tempest-ServersAdmin275Test-747893895-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 627.389767] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-0c941f8f-55ad-489d-9bd2-fd2d32270572 tempest-ServersAdmin275Test-747893895 tempest-ServersAdmin275Test-747893895-project-member] [instance: 0f8eea4d-cada-419c-b352-0bb99b9548af] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 627.389767] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-0c941f8f-55ad-489d-9bd2-fd2d32270572 tempest-ServersAdmin275Test-747893895 tempest-ServersAdmin275Test-747893895-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 627.389767] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-0c941f8f-55ad-489d-9bd2-fd2d32270572 tempest-ServersAdmin275Test-747893895 tempest-ServersAdmin275Test-747893895-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 627.389767] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-0c941f8f-55ad-489d-9bd2-fd2d32270572 tempest-ServersAdmin275Test-747893895 tempest-ServersAdmin275Test-747893895-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 627.398325] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-0c941f8f-55ad-489d-9bd2-fd2d32270572 tempest-ServersAdmin275Test-747893895 tempest-ServersAdmin275Test-747893895-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 627.398325] nova-conductor[52217]: Traceback (most recent call last): [ 627.398325] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 627.398325] nova-conductor[52217]: return func(*args, **kwargs) [ 627.398325] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 627.398325] nova-conductor[52217]: selections = self._select_destinations( [ 627.398325] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 627.398325] nova-conductor[52217]: selections = self._schedule( [ 627.398325] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 627.398325] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 627.398325] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 627.398325] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 627.398325] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 627.398325] nova-conductor[52217]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 627.398325] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-0c941f8f-55ad-489d-9bd2-fd2d32270572 tempest-ServersAdmin275Test-747893895 tempest-ServersAdmin275Test-747893895-project-member] [instance: 0f8eea4d-cada-419c-b352-0bb99b9548af] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 628.385205] nova-conductor[52217]: ERROR nova.scheduler.utils [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port b336f827-0ec0-43eb-9a05-24c3fb7bd880, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n 
self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 4a43bc91-94d5-46b4-8e29-e8a02d98249f was re-scheduled: Binding failed for port b336f827-0ec0-43eb-9a05-24c3fb7bd880, please check neutron logs for more information.\n'] [ 628.385457] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Rescheduling: True {{(pid=52217) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 628.385805] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 4a43bc91-94d5-46b4-8e29-e8a02d98249f.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 4a43bc91-94d5-46b4-8e29-e8a02d98249f. [ 628.385898] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 4a43bc91-94d5-46b4-8e29-e8a02d98249f. [ 628.413602] nova-conductor[52217]: DEBUG nova.network.neutron [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] deallocate_for_instance() {{(pid=52217) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 628.468022] nova-conductor[52217]: DEBUG nova.network.neutron [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] Instance cache missing network info. 
{{(pid=52217) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 628.472785] nova-conductor[52217]: DEBUG nova.network.neutron [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] Updating instance_info_cache with network_info: [] {{(pid=52217) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 629.488725] nova-conductor[52217]: ERROR nova.scheduler.utils [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, 
in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 7289eb8f-c182-468e-ac2a-4fd18c65208d, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 07eb4258-4513-45f4-9789-0b362028abd7 was re-scheduled: Binding failed for port 7289eb8f-c182-468e-ac2a-4fd18c65208d, please check neutron logs for more information.\n'] [ 629.492896] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Rescheduling: True {{(pid=52217) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 629.492896] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 07eb4258-4513-45f4-9789-0b362028abd7.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 07eb4258-4513-45f4-9789-0b362028abd7. [ 629.492896] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 07eb4258-4513-45f4-9789-0b362028abd7. [ 629.521788] nova-conductor[52217]: DEBUG nova.network.neutron [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] deallocate_for_instance() {{(pid=52217) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 629.624442] nova-conductor[52217]: DEBUG nova.network.neutron [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] Instance cache missing network info. 
{{(pid=52217) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 629.631714] nova-conductor[52217]: DEBUG nova.network.neutron [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] Updating instance_info_cache with network_info: [] {{(pid=52217) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 629.647830] nova-conductor[52217]: ERROR nova.conductor.manager [None req-6917563c-3f25-4b5a-aa02-1d08bab846af tempest-TenantUsagesTestJSON-413035115 tempest-TenantUsagesTestJSON-413035115-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 629.647830] nova-conductor[52217]: Traceback (most recent call last): [ 629.647830] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 629.647830] nova-conductor[52217]: return func(*args, **kwargs) [ 629.647830] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 629.647830] nova-conductor[52217]: selections = self._select_destinations( [ 629.647830] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 629.647830] nova-conductor[52217]: selections = self._schedule( [ 629.647830] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 629.647830] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 629.647830] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 629.647830] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 629.647830] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 629.647830] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 629.647830] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 629.647830] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 629.647830] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 629.647830] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 629.647830] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 629.647830] nova-conductor[52217]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 629.647830] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 629.647830] nova-conductor[52217]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 629.647830] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 629.647830] nova-conductor[52217]: ERROR nova.conductor.manager result = self.transport._send( [ 629.647830] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 629.647830] nova-conductor[52217]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 629.647830] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 629.647830] nova-conductor[52217]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 629.647830] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 629.647830] nova-conductor[52217]: ERROR nova.conductor.manager raise result [ 629.647830] nova-conductor[52217]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 629.647830] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 629.647830] nova-conductor[52217]: ERROR nova.conductor.manager [ 629.647830] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 629.647830] nova-conductor[52217]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 629.647830] nova-conductor[52217]: ERROR nova.conductor.manager [ 629.647830] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 629.647830] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._select_destinations( [ 629.647830] nova-conductor[52217]: ERROR nova.conductor.manager [ 629.647830] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 629.647830] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._schedule( [ 629.647830] nova-conductor[52217]: ERROR nova.conductor.manager [ 629.647830] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 629.647830] nova-conductor[52217]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 629.647830] nova-conductor[52217]: ERROR nova.conductor.manager [ 629.647830] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 629.647830] nova-conductor[52217]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 629.647830] nova-conductor[52217]: ERROR nova.conductor.manager [ 629.647830] nova-conductor[52217]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 629.647830] nova-conductor[52217]: ERROR nova.conductor.manager [ 629.647830] nova-conductor[52217]: ERROR nova.conductor.manager [ 629.654858] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-6917563c-3f25-4b5a-aa02-1d08bab846af tempest-TenantUsagesTestJSON-413035115 tempest-TenantUsagesTestJSON-413035115-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 629.655115] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-6917563c-3f25-4b5a-aa02-1d08bab846af tempest-TenantUsagesTestJSON-413035115 tempest-TenantUsagesTestJSON-413035115-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 629.655282] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-6917563c-3f25-4b5a-aa02-1d08bab846af tempest-TenantUsagesTestJSON-413035115 tempest-TenantUsagesTestJSON-413035115-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 629.837470] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-6917563c-3f25-4b5a-aa02-1d08bab846af tempest-TenantUsagesTestJSON-413035115 tempest-TenantUsagesTestJSON-413035115-project-member] [instance: cfe73cf5-8318-45f3-a241-c34bb4c74f1d] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 629.838614] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-6917563c-3f25-4b5a-aa02-1d08bab846af tempest-TenantUsagesTestJSON-413035115 tempest-TenantUsagesTestJSON-413035115-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 629.838945] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-6917563c-3f25-4b5a-aa02-1d08bab846af tempest-TenantUsagesTestJSON-413035115 tempest-TenantUsagesTestJSON-413035115-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 629.839382] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-6917563c-3f25-4b5a-aa02-1d08bab846af tempest-TenantUsagesTestJSON-413035115 tempest-TenantUsagesTestJSON-413035115-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 629.845384] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-6917563c-3f25-4b5a-aa02-1d08bab846af tempest-TenantUsagesTestJSON-413035115 tempest-TenantUsagesTestJSON-413035115-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 629.845384] nova-conductor[52217]: Traceback (most recent call last): [ 629.845384] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 629.845384] nova-conductor[52217]: return func(*args, **kwargs) [ 629.845384] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 629.845384] nova-conductor[52217]: selections = self._select_destinations( [ 629.845384] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 629.845384] nova-conductor[52217]: selections = self._schedule( [ 629.845384] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 629.845384] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 629.845384] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 629.845384] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 629.845384] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 629.845384] nova-conductor[52217]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 629.846616] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-6917563c-3f25-4b5a-aa02-1d08bab846af tempest-TenantUsagesTestJSON-413035115 tempest-TenantUsagesTestJSON-413035115-project-member] [instance: cfe73cf5-8318-45f3-a241-c34bb4c74f1d] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 636.591542] nova-conductor[52216]: ERROR nova.conductor.manager [None req-773f6e6f-c2e7-4fbb-ba71-42c0877eda0a tempest-SecurityGroupsTestJSON-2012520425 tempest-SecurityGroupsTestJSON-2012520425-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 636.591542] nova-conductor[52216]: Traceback (most recent call last): [ 636.591542] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 636.591542] nova-conductor[52216]: return func(*args, **kwargs) [ 636.591542] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 636.591542] nova-conductor[52216]: selections = self._select_destinations( [ 636.591542] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 636.591542] nova-conductor[52216]: selections = self._schedule( [ 636.591542] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 636.591542] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 636.591542] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 636.591542] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 636.591542] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 636.591542] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 636.591542] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 636.591542] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 636.591542] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 636.591542] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 636.591542] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 636.591542] nova-conductor[52216]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 636.591542] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 636.591542] nova-conductor[52216]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 636.591542] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 636.591542] nova-conductor[52216]: ERROR nova.conductor.manager result = self.transport._send( [ 636.591542] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 636.591542] nova-conductor[52216]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 636.591542] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 636.591542] nova-conductor[52216]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 636.591542] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 636.591542] nova-conductor[52216]: ERROR nova.conductor.manager raise result [ 636.591542] nova-conductor[52216]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 636.591542] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 636.591542] nova-conductor[52216]: ERROR nova.conductor.manager [ 636.591542] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 636.591542] nova-conductor[52216]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 636.591542] nova-conductor[52216]: ERROR nova.conductor.manager [ 636.591542] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 636.591542] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._select_destinations( [ 636.591542] nova-conductor[52216]: ERROR nova.conductor.manager [ 636.591542] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 636.591542] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._schedule( [ 636.591542] nova-conductor[52216]: ERROR nova.conductor.manager [ 636.591542] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 636.591542] nova-conductor[52216]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 636.591542] nova-conductor[52216]: ERROR nova.conductor.manager [ 636.591542] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 636.591542] nova-conductor[52216]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 636.591542] nova-conductor[52216]: ERROR nova.conductor.manager [ 636.591542] nova-conductor[52216]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 636.591542] nova-conductor[52216]: ERROR nova.conductor.manager [ 636.591542] nova-conductor[52216]: ERROR nova.conductor.manager [ 636.604712] nova-conductor[52216]: ERROR nova.scheduler.utils [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 5e08de89-08df-4236-82f9-1588491bdb78, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent 
call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 4fd28c4b-e5df-475b-bb3d-f163c9f5b436 was re-scheduled: Binding failed for port 5e08de89-08df-4236-82f9-1588491bdb78, please check neutron logs for more information.\n'] [ 636.605101] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Rescheduling: True {{(pid=52216) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 636.605264] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 4fd28c4b-e5df-475b-bb3d-f163c9f5b436.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 4fd28c4b-e5df-475b-bb3d-f163c9f5b436. [ 636.609287] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 4fd28c4b-e5df-475b-bb3d-f163c9f5b436. 
[ 636.626601] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-773f6e6f-c2e7-4fbb-ba71-42c0877eda0a tempest-SecurityGroupsTestJSON-2012520425 tempest-SecurityGroupsTestJSON-2012520425-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 636.626744] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-773f6e6f-c2e7-4fbb-ba71-42c0877eda0a tempest-SecurityGroupsTestJSON-2012520425 tempest-SecurityGroupsTestJSON-2012520425-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 636.626932] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-773f6e6f-c2e7-4fbb-ba71-42c0877eda0a tempest-SecurityGroupsTestJSON-2012520425 tempest-SecurityGroupsTestJSON-2012520425-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 636.653576] nova-conductor[52216]: DEBUG nova.network.neutron [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] deallocate_for_instance() {{(pid=52216) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 636.705364] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-773f6e6f-c2e7-4fbb-ba71-42c0877eda0a tempest-SecurityGroupsTestJSON-2012520425 tempest-SecurityGroupsTestJSON-2012520425-project-member] [instance: 1aa4d32e-073b-4aa3-84a7-9d7c5fe45814] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 636.706214] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-773f6e6f-c2e7-4fbb-ba71-42c0877eda0a tempest-SecurityGroupsTestJSON-2012520425 tempest-SecurityGroupsTestJSON-2012520425-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 636.706350] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-773f6e6f-c2e7-4fbb-ba71-42c0877eda0a tempest-SecurityGroupsTestJSON-2012520425 tempest-SecurityGroupsTestJSON-2012520425-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} 
[ 636.706512] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-773f6e6f-c2e7-4fbb-ba71-42c0877eda0a tempest-SecurityGroupsTestJSON-2012520425 tempest-SecurityGroupsTestJSON-2012520425-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 636.712538] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-773f6e6f-c2e7-4fbb-ba71-42c0877eda0a tempest-SecurityGroupsTestJSON-2012520425 tempest-SecurityGroupsTestJSON-2012520425-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 636.712538] nova-conductor[52216]: Traceback (most recent call last): [ 636.712538] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 636.712538] nova-conductor[52216]: return func(*args, **kwargs) [ 636.712538] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 636.712538] nova-conductor[52216]: selections = self._select_destinations( [ 636.712538] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 636.712538] nova-conductor[52216]: selections = self._schedule( [ 636.712538] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 636.712538] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 636.712538] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 636.712538] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 636.712538] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 636.712538] nova-conductor[52216]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 636.712967] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-773f6e6f-c2e7-4fbb-ba71-42c0877eda0a tempest-SecurityGroupsTestJSON-2012520425 tempest-SecurityGroupsTestJSON-2012520425-project-member] [instance: 1aa4d32e-073b-4aa3-84a7-9d7c5fe45814] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 636.778949] nova-conductor[52216]: DEBUG nova.network.neutron [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Instance cache missing network info. 
{{(pid=52216) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 636.784416] nova-conductor[52216]: DEBUG nova.network.neutron [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Updating instance_info_cache with network_info: [] {{(pid=52216) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 638.449825] nova-conductor[52216]: ERROR nova.conductor.manager [None req-5c49d3fe-460d-4b6f-9aa3-48eb16258ca5 tempest-VolumesAssistedSnapshotsTest-659397932 tempest-VolumesAssistedSnapshotsTest-659397932-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 638.449825] nova-conductor[52216]: Traceback (most recent call last): [ 638.449825] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 638.449825] nova-conductor[52216]: return func(*args, **kwargs) [ 638.449825] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 638.449825] nova-conductor[52216]: selections = self._select_destinations( [ 638.449825] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 638.449825] nova-conductor[52216]: selections = self._schedule( [ 638.449825] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 638.449825] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 638.449825] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 638.449825] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 638.449825] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 638.449825] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 638.449825] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 638.449825] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 638.449825] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 638.449825] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 638.449825] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 638.449825] nova-conductor[52216]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 638.449825] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 638.449825] nova-conductor[52216]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 638.449825] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 638.449825] nova-conductor[52216]: ERROR nova.conductor.manager result = self.transport._send( [ 638.449825] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 638.449825] nova-conductor[52216]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 638.449825] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 638.449825] nova-conductor[52216]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 638.449825] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 638.449825] nova-conductor[52216]: ERROR nova.conductor.manager raise result [ 638.449825] nova-conductor[52216]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 638.449825] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 638.449825] nova-conductor[52216]: ERROR nova.conductor.manager [ 638.449825] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 638.449825] nova-conductor[52216]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 638.449825] nova-conductor[52216]: ERROR nova.conductor.manager [ 638.449825] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 638.449825] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._select_destinations( [ 638.449825] nova-conductor[52216]: ERROR nova.conductor.manager [ 638.449825] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 638.449825] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._schedule( [ 638.449825] nova-conductor[52216]: ERROR nova.conductor.manager [ 638.449825] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 638.449825] nova-conductor[52216]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 638.449825] nova-conductor[52216]: ERROR nova.conductor.manager [ 638.449825] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 638.449825] nova-conductor[52216]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 638.449825] nova-conductor[52216]: ERROR nova.conductor.manager [ 638.449825] nova-conductor[52216]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 638.449825] nova-conductor[52216]: ERROR nova.conductor.manager [ 638.449825] nova-conductor[52216]: ERROR nova.conductor.manager [ 638.456821] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5c49d3fe-460d-4b6f-9aa3-48eb16258ca5 tempest-VolumesAssistedSnapshotsTest-659397932 tempest-VolumesAssistedSnapshotsTest-659397932-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 638.457076] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5c49d3fe-460d-4b6f-9aa3-48eb16258ca5 tempest-VolumesAssistedSnapshotsTest-659397932 tempest-VolumesAssistedSnapshotsTest-659397932-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 638.457250] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5c49d3fe-460d-4b6f-9aa3-48eb16258ca5 tempest-VolumesAssistedSnapshotsTest-659397932 tempest-VolumesAssistedSnapshotsTest-659397932-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 638.516150] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-5c49d3fe-460d-4b6f-9aa3-48eb16258ca5 tempest-VolumesAssistedSnapshotsTest-659397932 tempest-VolumesAssistedSnapshotsTest-659397932-project-member] [instance: 94385889-26e9-4091-8766-834be043c329] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 638.516150] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5c49d3fe-460d-4b6f-9aa3-48eb16258ca5 tempest-VolumesAssistedSnapshotsTest-659397932 tempest-VolumesAssistedSnapshotsTest-659397932-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 638.516150] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5c49d3fe-460d-4b6f-9aa3-48eb16258ca5 tempest-VolumesAssistedSnapshotsTest-659397932 tempest-VolumesAssistedSnapshotsTest-659397932-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 638.516150] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5c49d3fe-460d-4b6f-9aa3-48eb16258ca5 tempest-VolumesAssistedSnapshotsTest-659397932 
tempest-VolumesAssistedSnapshotsTest-659397932-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 638.525557] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-5c49d3fe-460d-4b6f-9aa3-48eb16258ca5 tempest-VolumesAssistedSnapshotsTest-659397932 tempest-VolumesAssistedSnapshotsTest-659397932-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 638.525557] nova-conductor[52216]: Traceback (most recent call last): [ 638.525557] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 638.525557] nova-conductor[52216]: return func(*args, **kwargs) [ 638.525557] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 638.525557] nova-conductor[52216]: selections = self._select_destinations( [ 638.525557] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 638.525557] nova-conductor[52216]: selections = self._schedule( [ 638.525557] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 638.525557] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 638.525557] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 638.525557] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 638.525557] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 638.525557] nova-conductor[52216]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 638.525557] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-5c49d3fe-460d-4b6f-9aa3-48eb16258ca5 tempest-VolumesAssistedSnapshotsTest-659397932 tempest-VolumesAssistedSnapshotsTest-659397932-project-member] [instance: 94385889-26e9-4091-8766-834be043c329] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 639.087245] nova-conductor[52216]: ERROR nova.scheduler.utils [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 9d54dbc2-c248-4bfc-a3d5-70d355561876, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n 
self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e was re-scheduled: Binding failed for port 9d54dbc2-c248-4bfc-a3d5-70d355561876, please check neutron logs for more information.\n'] [ 639.089018] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Rescheduling: True {{(pid=52216) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 639.089018] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e. [ 639.089018] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e. [ 639.107931] nova-conductor[52216]: DEBUG nova.network.neutron [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] deallocate_for_instance() {{(pid=52216) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 639.187350] nova-conductor[52216]: DEBUG nova.network.neutron [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] Instance cache missing network info. 
{{(pid=52216) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 639.191285] nova-conductor[52216]: DEBUG nova.network.neutron [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] Updating instance_info_cache with network_info: [] {{(pid=52216) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 640.120445] nova-conductor[52217]: ERROR nova.scheduler.utils [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File 
"/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 252581c2-e5f7-471e-bfe2-b357c144e919, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance b1a72905-ae94-42e3-8926-0f81cb502942 was re-scheduled: Binding failed for port 252581c2-e5f7-471e-bfe2-b357c144e919, please check neutron logs for more information.\n'] [ 640.121068] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Rescheduling: True {{(pid=52217) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 640.121327] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance b1a72905-ae94-42e3-8926-0f81cb502942.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance b1a72905-ae94-42e3-8926-0f81cb502942. [ 640.121548] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance b1a72905-ae94-42e3-8926-0f81cb502942. [ 640.151856] nova-conductor[52217]: DEBUG nova.network.neutron [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] deallocate_for_instance() {{(pid=52217) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 640.242545] nova-conductor[52217]: DEBUG nova.network.neutron [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] Instance cache missing network info. 
{{(pid=52217) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 640.249195] nova-conductor[52217]: DEBUG nova.network.neutron [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] Updating instance_info_cache with network_info: [] {{(pid=52217) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 642.554128] nova-conductor[52217]: ERROR nova.conductor.manager [None req-4e223444-afc7-4428-b9d3-d477aa7917db tempest-MigrationsAdminTest-714504537 tempest-MigrationsAdminTest-714504537-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 642.554128] nova-conductor[52217]: Traceback (most recent call last): [ 642.554128] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 642.554128] nova-conductor[52217]: return func(*args, **kwargs) [ 642.554128] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 642.554128] nova-conductor[52217]: selections = self._select_destinations( [ 642.554128] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 642.554128] nova-conductor[52217]: selections = self._schedule( [ 642.554128] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 642.554128] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 642.554128] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 642.554128] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 642.554128] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 642.554128] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 642.554128] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 642.554128] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 642.554128] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 642.554128] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 642.554128] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 642.554128] nova-conductor[52217]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 642.554128] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 642.554128] nova-conductor[52217]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 642.554128] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 642.554128] nova-conductor[52217]: ERROR nova.conductor.manager result = self.transport._send( [ 642.554128] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 642.554128] nova-conductor[52217]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 642.554128] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 642.554128] nova-conductor[52217]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 642.554128] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 642.554128] nova-conductor[52217]: ERROR nova.conductor.manager raise result [ 642.554128] nova-conductor[52217]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 642.554128] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 642.554128] nova-conductor[52217]: ERROR nova.conductor.manager [ 642.554128] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 642.554128] nova-conductor[52217]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 642.554128] nova-conductor[52217]: ERROR nova.conductor.manager [ 642.554128] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 642.554128] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._select_destinations( [ 642.554128] nova-conductor[52217]: ERROR nova.conductor.manager [ 642.554128] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 642.554128] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._schedule( [ 642.554128] nova-conductor[52217]: ERROR nova.conductor.manager [ 642.554128] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 642.554128] nova-conductor[52217]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 642.554128] nova-conductor[52217]: ERROR nova.conductor.manager [ 642.554128] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 642.554128] nova-conductor[52217]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 642.554128] nova-conductor[52217]: ERROR nova.conductor.manager [ 642.554128] nova-conductor[52217]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 642.554128] nova-conductor[52217]: ERROR nova.conductor.manager [ 642.554128] nova-conductor[52217]: ERROR nova.conductor.manager [ 642.563519] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-4e223444-afc7-4428-b9d3-d477aa7917db tempest-MigrationsAdminTest-714504537 tempest-MigrationsAdminTest-714504537-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 642.563604] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-4e223444-afc7-4428-b9d3-d477aa7917db tempest-MigrationsAdminTest-714504537 tempest-MigrationsAdminTest-714504537-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 642.563770] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-4e223444-afc7-4428-b9d3-d477aa7917db tempest-MigrationsAdminTest-714504537 tempest-MigrationsAdminTest-714504537-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 642.610894] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-4e223444-afc7-4428-b9d3-d477aa7917db tempest-MigrationsAdminTest-714504537 tempest-MigrationsAdminTest-714504537-project-member] [instance: d81d9ec2-c548-4f1c-8ba6-623d919a2417] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 642.611749] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-4e223444-afc7-4428-b9d3-d477aa7917db tempest-MigrationsAdminTest-714504537 tempest-MigrationsAdminTest-714504537-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 642.612018] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-4e223444-afc7-4428-b9d3-d477aa7917db tempest-MigrationsAdminTest-714504537 tempest-MigrationsAdminTest-714504537-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 642.612225] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-4e223444-afc7-4428-b9d3-d477aa7917db tempest-MigrationsAdminTest-714504537 tempest-MigrationsAdminTest-714504537-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 642.615409] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-4e223444-afc7-4428-b9d3-d477aa7917db tempest-MigrationsAdminTest-714504537 tempest-MigrationsAdminTest-714504537-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 642.615409] nova-conductor[52217]: Traceback (most recent call last): [ 642.615409] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 642.615409] nova-conductor[52217]: return func(*args, **kwargs) [ 642.615409] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 642.615409] nova-conductor[52217]: selections = self._select_destinations( [ 642.615409] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 642.615409] nova-conductor[52217]: selections = self._schedule( [ 642.615409] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 642.615409] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 642.615409] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 642.615409] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 642.615409] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 642.615409] nova-conductor[52217]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 642.616015] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-4e223444-afc7-4428-b9d3-d477aa7917db tempest-MigrationsAdminTest-714504537 tempest-MigrationsAdminTest-714504537-project-member] [instance: d81d9ec2-c548-4f1c-8ba6-623d919a2417] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 646.365018] nova-conductor[52216]: ERROR nova.conductor.manager [None req-a55d9e52-65cd-43cd-82fd-280eacdbba42 tempest-ServersTestMultiNic-235525167 tempest-ServersTestMultiNic-235525167-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 646.365018] nova-conductor[52216]: Traceback (most recent call last): [ 646.365018] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 646.365018] nova-conductor[52216]: return func(*args, **kwargs) [ 646.365018] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 646.365018] nova-conductor[52216]: selections = self._select_destinations( [ 646.365018] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 646.365018] nova-conductor[52216]: selections = self._schedule( [ 646.365018] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 646.365018] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 646.365018] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 646.365018] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 646.365018] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 646.365018] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 646.365018] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 646.365018] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 646.365018] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 646.365018] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 646.365018] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 646.365018] nova-conductor[52216]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 646.365018] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 646.365018] nova-conductor[52216]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 646.365018] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 646.365018] nova-conductor[52216]: ERROR nova.conductor.manager result = self.transport._send( [ 646.365018] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 646.365018] nova-conductor[52216]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 646.365018] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 646.365018] nova-conductor[52216]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 646.365018] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 646.365018] nova-conductor[52216]: ERROR nova.conductor.manager raise result [ 646.365018] nova-conductor[52216]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 646.365018] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 646.365018] nova-conductor[52216]: ERROR nova.conductor.manager [ 646.365018] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 646.365018] nova-conductor[52216]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 646.365018] nova-conductor[52216]: ERROR nova.conductor.manager [ 646.365018] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 646.365018] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._select_destinations( [ 646.365018] nova-conductor[52216]: ERROR nova.conductor.manager [ 646.365018] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 646.365018] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._schedule( [ 646.365018] nova-conductor[52216]: ERROR nova.conductor.manager [ 646.365018] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 646.365018] nova-conductor[52216]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 646.365018] nova-conductor[52216]: ERROR nova.conductor.manager [ 646.365018] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 646.365018] nova-conductor[52216]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 646.365018] nova-conductor[52216]: ERROR nova.conductor.manager [ 646.365018] nova-conductor[52216]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 646.365018] nova-conductor[52216]: ERROR nova.conductor.manager [ 646.365018] nova-conductor[52216]: ERROR nova.conductor.manager [ 646.374047] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-a55d9e52-65cd-43cd-82fd-280eacdbba42 tempest-ServersTestMultiNic-235525167 tempest-ServersTestMultiNic-235525167-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 646.374546] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-a55d9e52-65cd-43cd-82fd-280eacdbba42 tempest-ServersTestMultiNic-235525167 tempest-ServersTestMultiNic-235525167-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 646.374720] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-a55d9e52-65cd-43cd-82fd-280eacdbba42 tempest-ServersTestMultiNic-235525167 tempest-ServersTestMultiNic-235525167-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 646.464933] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-a55d9e52-65cd-43cd-82fd-280eacdbba42 tempest-ServersTestMultiNic-235525167 tempest-ServersTestMultiNic-235525167-project-member] [instance: 6b2679fa-6d24-4d18-895b-65c09575d7af] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 646.465886] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-a55d9e52-65cd-43cd-82fd-280eacdbba42 tempest-ServersTestMultiNic-235525167 tempest-ServersTestMultiNic-235525167-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 646.466094] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-a55d9e52-65cd-43cd-82fd-280eacdbba42 tempest-ServersTestMultiNic-235525167 tempest-ServersTestMultiNic-235525167-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 646.466312] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-a55d9e52-65cd-43cd-82fd-280eacdbba42 tempest-ServersTestMultiNic-235525167 tempest-ServersTestMultiNic-235525167-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 646.471879] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-a55d9e52-65cd-43cd-82fd-280eacdbba42 tempest-ServersTestMultiNic-235525167 tempest-ServersTestMultiNic-235525167-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 646.471879] nova-conductor[52216]: Traceback (most recent call last): [ 646.471879] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 646.471879] nova-conductor[52216]: return func(*args, **kwargs) [ 646.471879] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 646.471879] nova-conductor[52216]: selections = self._select_destinations( [ 646.471879] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 646.471879] nova-conductor[52216]: selections = self._schedule( [ 646.471879] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 646.471879] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 646.471879] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 646.471879] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 646.471879] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 646.471879] nova-conductor[52216]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 646.472428] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-a55d9e52-65cd-43cd-82fd-280eacdbba42 tempest-ServersTestMultiNic-235525167 tempest-ServersTestMultiNic-235525167-project-member] [instance: 6b2679fa-6d24-4d18-895b-65c09575d7af] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 648.081596] nova-conductor[52217]: ERROR nova.conductor.manager [None req-ac457091-6c12-4b72-afe9-6abd88bd1a46 tempest-ListImageFiltersTestJSON-1201066010 tempest-ListImageFiltersTestJSON-1201066010-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 648.081596] nova-conductor[52217]: Traceback (most recent call last): [ 648.081596] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 648.081596] nova-conductor[52217]: return func(*args, **kwargs) [ 648.081596] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 648.081596] nova-conductor[52217]: selections = self._select_destinations( [ 648.081596] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 648.081596] nova-conductor[52217]: selections = self._schedule( [ 648.081596] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 648.081596] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 648.081596] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 648.081596] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 648.081596] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 648.081596] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 648.081596] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 648.081596] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 648.081596] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 648.081596] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 648.081596] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 648.081596] nova-conductor[52217]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 648.081596] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 648.081596] nova-conductor[52217]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 648.081596] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 648.081596] nova-conductor[52217]: ERROR nova.conductor.manager result = self.transport._send( [ 648.081596] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 648.081596] nova-conductor[52217]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 648.081596] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 648.081596] nova-conductor[52217]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 648.081596] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 648.081596] nova-conductor[52217]: ERROR nova.conductor.manager raise result [ 648.081596] nova-conductor[52217]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 648.081596] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 648.081596] nova-conductor[52217]: ERROR nova.conductor.manager [ 648.081596] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 648.081596] nova-conductor[52217]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 648.081596] nova-conductor[52217]: ERROR nova.conductor.manager [ 648.081596] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 648.081596] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._select_destinations( [ 648.081596] nova-conductor[52217]: ERROR nova.conductor.manager [ 648.081596] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 648.081596] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._schedule( [ 648.081596] nova-conductor[52217]: ERROR nova.conductor.manager [ 648.081596] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 648.081596] nova-conductor[52217]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 648.081596] nova-conductor[52217]: ERROR nova.conductor.manager [ 648.081596] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 648.081596] nova-conductor[52217]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 648.081596] nova-conductor[52217]: ERROR nova.conductor.manager [ 648.081596] nova-conductor[52217]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 648.081596] nova-conductor[52217]: ERROR nova.conductor.manager [ 648.081596] nova-conductor[52217]: ERROR nova.conductor.manager [ 648.089202] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-ac457091-6c12-4b72-afe9-6abd88bd1a46 tempest-ListImageFiltersTestJSON-1201066010 tempest-ListImageFiltersTestJSON-1201066010-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 648.089601] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-ac457091-6c12-4b72-afe9-6abd88bd1a46 tempest-ListImageFiltersTestJSON-1201066010 tempest-ListImageFiltersTestJSON-1201066010-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 648.091017] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-ac457091-6c12-4b72-afe9-6abd88bd1a46 tempest-ListImageFiltersTestJSON-1201066010 tempest-ListImageFiltersTestJSON-1201066010-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 648.146484] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-ac457091-6c12-4b72-afe9-6abd88bd1a46 tempest-ListImageFiltersTestJSON-1201066010 tempest-ListImageFiltersTestJSON-1201066010-project-member] [instance: 67a1092b-1252-4bbf-8888-5be15bf8109d] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 648.147237] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-ac457091-6c12-4b72-afe9-6abd88bd1a46 tempest-ListImageFiltersTestJSON-1201066010 tempest-ListImageFiltersTestJSON-1201066010-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 648.148541] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-ac457091-6c12-4b72-afe9-6abd88bd1a46 tempest-ListImageFiltersTestJSON-1201066010 tempest-ListImageFiltersTestJSON-1201066010-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 648.148541] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-ac457091-6c12-4b72-afe9-6abd88bd1a46 tempest-ListImageFiltersTestJSON-1201066010 tempest-ListImageFiltersTestJSON-1201066010-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 648.152473] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-ac457091-6c12-4b72-afe9-6abd88bd1a46 tempest-ListImageFiltersTestJSON-1201066010 tempest-ListImageFiltersTestJSON-1201066010-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 648.152473] nova-conductor[52217]: Traceback (most recent call last): [ 648.152473] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 648.152473] nova-conductor[52217]: return func(*args, **kwargs) [ 648.152473] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 648.152473] nova-conductor[52217]: selections = self._select_destinations( [ 648.152473] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 648.152473] nova-conductor[52217]: selections = self._schedule( [ 648.152473] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 648.152473] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 648.152473] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 648.152473] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 648.152473] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 648.152473] nova-conductor[52217]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 648.152989] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-ac457091-6c12-4b72-afe9-6abd88bd1a46 tempest-ListImageFiltersTestJSON-1201066010 tempest-ListImageFiltersTestJSON-1201066010-project-member] [instance: 67a1092b-1252-4bbf-8888-5be15bf8109d] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 650.078134] nova-conductor[52216]: ERROR nova.conductor.manager [None req-96018aa7-555a-437d-8092-4a99f8faabd1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 650.078134] nova-conductor[52216]: Traceback (most recent call last): [ 650.078134] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 650.078134] nova-conductor[52216]: return func(*args, **kwargs) [ 650.078134] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 650.078134] nova-conductor[52216]: selections = self._select_destinations( [ 650.078134] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 650.078134] nova-conductor[52216]: selections = self._schedule( [ 650.078134] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 650.078134] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 650.078134] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 650.078134] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 650.078134] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 650.078134] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 650.078134] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 650.078134] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 650.078134] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 650.078134] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 650.078134] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 650.078134] nova-conductor[52216]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 650.078134] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 650.078134] nova-conductor[52216]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 650.078134] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 650.078134] nova-conductor[52216]: ERROR nova.conductor.manager result = self.transport._send( [ 650.078134] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 650.078134] nova-conductor[52216]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 650.078134] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 650.078134] nova-conductor[52216]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 650.078134] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 650.078134] nova-conductor[52216]: ERROR nova.conductor.manager raise result [ 650.078134] nova-conductor[52216]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 650.078134] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 650.078134] nova-conductor[52216]: ERROR nova.conductor.manager [ 650.078134] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 650.078134] nova-conductor[52216]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 650.078134] nova-conductor[52216]: ERROR nova.conductor.manager [ 650.078134] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 650.078134] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._select_destinations( [ 650.078134] nova-conductor[52216]: ERROR nova.conductor.manager [ 650.078134] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 650.078134] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._schedule( [ 650.078134] nova-conductor[52216]: ERROR nova.conductor.manager [ 650.078134] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 650.078134] nova-conductor[52216]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 650.078134] nova-conductor[52216]: ERROR nova.conductor.manager [ 650.078134] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 650.078134] nova-conductor[52216]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 650.078134] nova-conductor[52216]: ERROR nova.conductor.manager [ 650.078134] nova-conductor[52216]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 650.078134] nova-conductor[52216]: ERROR nova.conductor.manager [ 650.078134] nova-conductor[52216]: ERROR nova.conductor.manager [ 650.085201] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-96018aa7-555a-437d-8092-4a99f8faabd1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 650.085482] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-96018aa7-555a-437d-8092-4a99f8faabd1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 650.085578] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-96018aa7-555a-437d-8092-4a99f8faabd1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 650.144820] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-96018aa7-555a-437d-8092-4a99f8faabd1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: 8f0af9ab-95e6-424b-97ba-e015cb9dcbf3] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 650.145646] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-96018aa7-555a-437d-8092-4a99f8faabd1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 650.146111] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-96018aa7-555a-437d-8092-4a99f8faabd1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 650.146111] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-96018aa7-555a-437d-8092-4a99f8faabd1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 650.150752] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-96018aa7-555a-437d-8092-4a99f8faabd1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 650.150752] nova-conductor[52216]: Traceback (most recent call last): [ 650.150752] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 650.150752] nova-conductor[52216]: return func(*args, **kwargs) [ 650.150752] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 650.150752] nova-conductor[52216]: selections = self._select_destinations( [ 650.150752] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 650.150752] nova-conductor[52216]: selections = self._schedule( [ 650.150752] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 650.150752] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 650.150752] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 650.150752] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 650.150752] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 650.150752] nova-conductor[52216]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 650.150752] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-96018aa7-555a-437d-8092-4a99f8faabd1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: 8f0af9ab-95e6-424b-97ba-e015cb9dcbf3] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 650.498614] nova-conductor[52217]: ERROR nova.conductor.manager [None req-0e451501-8503-40cb-b954-bb5999a8f5d9 tempest-ListImageFiltersTestJSON-1201066010 tempest-ListImageFiltersTestJSON-1201066010-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 650.498614] nova-conductor[52217]: Traceback (most recent call last): [ 650.498614] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 650.498614] nova-conductor[52217]: return func(*args, **kwargs) [ 650.498614] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 650.498614] nova-conductor[52217]: selections = self._select_destinations( [ 650.498614] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 650.498614] nova-conductor[52217]: selections = self._schedule( [ 650.498614] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 650.498614] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 650.498614] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 650.498614] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 650.498614] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 650.498614] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 650.498614] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 650.498614] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 650.498614] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 650.498614] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 650.498614] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 650.498614] nova-conductor[52217]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 650.498614] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 650.498614] nova-conductor[52217]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 650.498614] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 650.498614] nova-conductor[52217]: ERROR nova.conductor.manager result = self.transport._send( [ 650.498614] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 650.498614] nova-conductor[52217]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 650.498614] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 650.498614] nova-conductor[52217]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 650.498614] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 650.498614] nova-conductor[52217]: ERROR nova.conductor.manager raise result [ 650.498614] nova-conductor[52217]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 650.498614] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 650.498614] nova-conductor[52217]: ERROR nova.conductor.manager [ 650.498614] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 650.498614] nova-conductor[52217]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 650.498614] nova-conductor[52217]: ERROR nova.conductor.manager [ 650.498614] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 650.498614] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._select_destinations( [ 650.498614] nova-conductor[52217]: ERROR nova.conductor.manager [ 650.498614] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 650.498614] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._schedule( [ 650.498614] nova-conductor[52217]: ERROR nova.conductor.manager [ 650.498614] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 650.498614] nova-conductor[52217]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 650.498614] nova-conductor[52217]: ERROR nova.conductor.manager [ 650.498614] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 650.498614] nova-conductor[52217]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 650.498614] nova-conductor[52217]: ERROR nova.conductor.manager [ 650.498614] nova-conductor[52217]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 650.498614] nova-conductor[52217]: ERROR nova.conductor.manager [ 650.498614] nova-conductor[52217]: ERROR nova.conductor.manager [ 650.511127] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-0e451501-8503-40cb-b954-bb5999a8f5d9 tempest-ListImageFiltersTestJSON-1201066010 tempest-ListImageFiltersTestJSON-1201066010-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 650.511127] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-0e451501-8503-40cb-b954-bb5999a8f5d9 tempest-ListImageFiltersTestJSON-1201066010 tempest-ListImageFiltersTestJSON-1201066010-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 650.511127] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-0e451501-8503-40cb-b954-bb5999a8f5d9 tempest-ListImageFiltersTestJSON-1201066010 tempest-ListImageFiltersTestJSON-1201066010-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 650.582520] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-0e451501-8503-40cb-b954-bb5999a8f5d9 tempest-ListImageFiltersTestJSON-1201066010 tempest-ListImageFiltersTestJSON-1201066010-project-member] [instance: 5eb4982f-3953-459b-9d30-d1d4de19add1] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 650.583351] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-0e451501-8503-40cb-b954-bb5999a8f5d9 tempest-ListImageFiltersTestJSON-1201066010 tempest-ListImageFiltersTestJSON-1201066010-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 650.583452] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-0e451501-8503-40cb-b954-bb5999a8f5d9 tempest-ListImageFiltersTestJSON-1201066010 tempest-ListImageFiltersTestJSON-1201066010-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 650.583656] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-0e451501-8503-40cb-b954-bb5999a8f5d9 tempest-ListImageFiltersTestJSON-1201066010 tempest-ListImageFiltersTestJSON-1201066010-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 650.587570] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-0e451501-8503-40cb-b954-bb5999a8f5d9 tempest-ListImageFiltersTestJSON-1201066010 tempest-ListImageFiltersTestJSON-1201066010-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 650.587570] nova-conductor[52217]: Traceback (most recent call last): [ 650.587570] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 650.587570] nova-conductor[52217]: return func(*args, **kwargs) [ 650.587570] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 650.587570] nova-conductor[52217]: selections = self._select_destinations( [ 650.587570] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 650.587570] nova-conductor[52217]: selections = self._schedule( [ 650.587570] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 650.587570] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 650.587570] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 650.587570] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 650.587570] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 650.587570] nova-conductor[52217]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 650.588111] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-0e451501-8503-40cb-b954-bb5999a8f5d9 tempest-ListImageFiltersTestJSON-1201066010 tempest-ListImageFiltersTestJSON-1201066010-project-member] [instance: 5eb4982f-3953-459b-9d30-d1d4de19add1] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 654.045310] nova-conductor[52216]: ERROR nova.conductor.manager [None req-ad13b98d-cbd4-40a8-9cff-1d6ce9666f36 tempest-ServerMetadataNegativeTestJSON-1652162511 tempest-ServerMetadataNegativeTestJSON-1652162511-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 654.045310] nova-conductor[52216]: Traceback (most recent call last): [ 654.045310] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 654.045310] nova-conductor[52216]: return func(*args, **kwargs) [ 654.045310] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 654.045310] nova-conductor[52216]: selections = self._select_destinations( [ 654.045310] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 654.045310] nova-conductor[52216]: selections = self._schedule( [ 654.045310] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 654.045310] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 654.045310] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 654.045310] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 654.045310] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 654.045310] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 654.045310] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 654.045310] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 654.045310] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 654.045310] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 654.045310] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 654.045310] nova-conductor[52216]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 654.045310] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 654.045310] nova-conductor[52216]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 654.045310] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 654.045310] nova-conductor[52216]: ERROR nova.conductor.manager result = self.transport._send( [ 654.045310] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 654.045310] nova-conductor[52216]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 654.045310] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 654.045310] nova-conductor[52216]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 654.045310] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 654.045310] nova-conductor[52216]: ERROR nova.conductor.manager raise result [ 654.045310] nova-conductor[52216]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 654.045310] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 654.045310] nova-conductor[52216]: ERROR nova.conductor.manager [ 654.045310] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 654.045310] nova-conductor[52216]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 654.045310] nova-conductor[52216]: ERROR nova.conductor.manager [ 654.045310] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 654.045310] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._select_destinations( [ 654.045310] nova-conductor[52216]: ERROR nova.conductor.manager [ 654.045310] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 654.045310] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._schedule( [ 654.045310] nova-conductor[52216]: ERROR nova.conductor.manager [ 654.045310] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 654.045310] nova-conductor[52216]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 654.045310] nova-conductor[52216]: ERROR nova.conductor.manager [ 654.045310] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 654.045310] nova-conductor[52216]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 654.045310] nova-conductor[52216]: ERROR nova.conductor.manager [ 654.045310] nova-conductor[52216]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 654.045310] nova-conductor[52216]: ERROR nova.conductor.manager [ 654.045310] nova-conductor[52216]: ERROR nova.conductor.manager [ 654.058429] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-ad13b98d-cbd4-40a8-9cff-1d6ce9666f36 tempest-ServerMetadataNegativeTestJSON-1652162511 tempest-ServerMetadataNegativeTestJSON-1652162511-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 654.058429] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-ad13b98d-cbd4-40a8-9cff-1d6ce9666f36 tempest-ServerMetadataNegativeTestJSON-1652162511 tempest-ServerMetadataNegativeTestJSON-1652162511-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 654.058662] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-ad13b98d-cbd4-40a8-9cff-1d6ce9666f36 tempest-ServerMetadataNegativeTestJSON-1652162511 tempest-ServerMetadataNegativeTestJSON-1652162511-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 654.103734] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-ad13b98d-cbd4-40a8-9cff-1d6ce9666f36 tempest-ServerMetadataNegativeTestJSON-1652162511 tempest-ServerMetadataNegativeTestJSON-1652162511-project-member] [instance: 8bec0e37-bde4-4299-aa3c-b47f53ad05f1] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 654.103734] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-ad13b98d-cbd4-40a8-9cff-1d6ce9666f36 tempest-ServerMetadataNegativeTestJSON-1652162511 tempest-ServerMetadataNegativeTestJSON-1652162511-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 654.103995] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-ad13b98d-cbd4-40a8-9cff-1d6ce9666f36 tempest-ServerMetadataNegativeTestJSON-1652162511 tempest-ServerMetadataNegativeTestJSON-1652162511-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 654.103995] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-ad13b98d-cbd4-40a8-9cff-1d6ce9666f36 
tempest-ServerMetadataNegativeTestJSON-1652162511 tempest-ServerMetadataNegativeTestJSON-1652162511-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 654.106796] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-ad13b98d-cbd4-40a8-9cff-1d6ce9666f36 tempest-ServerMetadataNegativeTestJSON-1652162511 tempest-ServerMetadataNegativeTestJSON-1652162511-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 654.106796] nova-conductor[52216]: Traceback (most recent call last): [ 654.106796] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 654.106796] nova-conductor[52216]: return func(*args, **kwargs) [ 654.106796] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 654.106796] nova-conductor[52216]: selections = self._select_destinations( [ 654.106796] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 654.106796] nova-conductor[52216]: selections = self._schedule( [ 654.106796] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 654.106796] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 654.106796] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 654.106796] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 654.106796] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 654.106796] nova-conductor[52216]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 654.107568] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-ad13b98d-cbd4-40a8-9cff-1d6ce9666f36 tempest-ServerMetadataNegativeTestJSON-1652162511 tempest-ServerMetadataNegativeTestJSON-1652162511-project-member] [instance: 8bec0e37-bde4-4299-aa3c-b47f53ad05f1] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 656.231404] nova-conductor[52217]: ERROR nova.conductor.manager [None req-a6adc284-e048-4f6d-8f68-b76182a9b9d0 tempest-ServersWithSpecificFlavorTestJSON-749080009 tempest-ServersWithSpecificFlavorTestJSON-749080009-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 656.231404] nova-conductor[52217]: Traceback (most recent call last): [ 656.231404] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 656.231404] nova-conductor[52217]: return func(*args, **kwargs) [ 656.231404] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 656.231404] nova-conductor[52217]: selections = self._select_destinations( [ 656.231404] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 656.231404] nova-conductor[52217]: selections = self._schedule( [ 656.231404] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 656.231404] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 656.231404] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 656.231404] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 656.231404] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 656.231404] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 656.231404] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 656.231404] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 656.231404] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 656.231404] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 656.231404] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 656.231404] nova-conductor[52217]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 656.231404] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 656.231404] nova-conductor[52217]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 656.231404] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 656.231404] nova-conductor[52217]: ERROR nova.conductor.manager result = self.transport._send( [ 656.231404] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 656.231404] nova-conductor[52217]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 656.231404] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 656.231404] nova-conductor[52217]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 656.231404] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 656.231404] nova-conductor[52217]: ERROR nova.conductor.manager raise result [ 656.231404] nova-conductor[52217]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 656.231404] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 656.231404] nova-conductor[52217]: ERROR nova.conductor.manager [ 656.231404] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 656.231404] nova-conductor[52217]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 656.231404] nova-conductor[52217]: ERROR nova.conductor.manager [ 656.231404] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 656.231404] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._select_destinations( [ 656.231404] nova-conductor[52217]: ERROR nova.conductor.manager [ 656.231404] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 656.231404] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._schedule( [ 656.231404] nova-conductor[52217]: ERROR nova.conductor.manager [ 656.231404] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 656.231404] nova-conductor[52217]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 656.231404] nova-conductor[52217]: ERROR nova.conductor.manager [ 656.231404] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 656.231404] nova-conductor[52217]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 656.231404] nova-conductor[52217]: ERROR nova.conductor.manager [ 656.231404] nova-conductor[52217]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 656.231404] nova-conductor[52217]: ERROR nova.conductor.manager [ 656.231404] nova-conductor[52217]: ERROR nova.conductor.manager [ 656.243935] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a6adc284-e048-4f6d-8f68-b76182a9b9d0 tempest-ServersWithSpecificFlavorTestJSON-749080009 tempest-ServersWithSpecificFlavorTestJSON-749080009-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 656.244486] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a6adc284-e048-4f6d-8f68-b76182a9b9d0 tempest-ServersWithSpecificFlavorTestJSON-749080009 tempest-ServersWithSpecificFlavorTestJSON-749080009-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 656.245889] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a6adc284-e048-4f6d-8f68-b76182a9b9d0 tempest-ServersWithSpecificFlavorTestJSON-749080009 tempest-ServersWithSpecificFlavorTestJSON-749080009-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 656.330625] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-a6adc284-e048-4f6d-8f68-b76182a9b9d0 tempest-ServersWithSpecificFlavorTestJSON-749080009 tempest-ServersWithSpecificFlavorTestJSON-749080009-project-member] [instance: bf3a1f73-e1ac-43d7-abed-8612f3ac4d83] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 656.330625] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a6adc284-e048-4f6d-8f68-b76182a9b9d0 tempest-ServersWithSpecificFlavorTestJSON-749080009 tempest-ServersWithSpecificFlavorTestJSON-749080009-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 656.330625] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a6adc284-e048-4f6d-8f68-b76182a9b9d0 tempest-ServersWithSpecificFlavorTestJSON-749080009 tempest-ServersWithSpecificFlavorTestJSON-749080009-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 656.330625] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a6adc284-e048-4f6d-8f68-b76182a9b9d0 
tempest-ServersWithSpecificFlavorTestJSON-749080009 tempest-ServersWithSpecificFlavorTestJSON-749080009-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 656.333066] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-a6adc284-e048-4f6d-8f68-b76182a9b9d0 tempest-ServersWithSpecificFlavorTestJSON-749080009 tempest-ServersWithSpecificFlavorTestJSON-749080009-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 656.333066] nova-conductor[52217]: Traceback (most recent call last): [ 656.333066] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 656.333066] nova-conductor[52217]: return func(*args, **kwargs) [ 656.333066] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 656.333066] nova-conductor[52217]: selections = self._select_destinations( [ 656.333066] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 656.333066] nova-conductor[52217]: selections = self._schedule( [ 656.333066] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 656.333066] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 656.333066] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 656.333066] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 656.333066] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 656.333066] nova-conductor[52217]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 656.333592] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-a6adc284-e048-4f6d-8f68-b76182a9b9d0 tempest-ServersWithSpecificFlavorTestJSON-749080009 tempest-ServersWithSpecificFlavorTestJSON-749080009-project-member] [instance: bf3a1f73-e1ac-43d7-abed-8612f3ac4d83] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 658.873697] nova-conductor[52216]: ERROR nova.conductor.manager [None req-818359be-85d9-404a-954b-ddb9478cbfa6 tempest-VolumesAdminNegativeTest-1360743485 tempest-VolumesAdminNegativeTest-1360743485-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 658.873697] nova-conductor[52216]: Traceback (most recent call last): [ 658.873697] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 658.873697] nova-conductor[52216]: return func(*args, **kwargs) [ 658.873697] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 658.873697] nova-conductor[52216]: selections = self._select_destinations( [ 658.873697] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 658.873697] nova-conductor[52216]: selections = self._schedule( [ 658.873697] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 658.873697] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 658.873697] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 658.873697] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 658.873697] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 658.873697] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 658.873697] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 658.873697] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 658.873697] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 658.873697] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 658.873697] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 658.873697] nova-conductor[52216]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 658.873697] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 658.873697] nova-conductor[52216]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 658.873697] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 658.873697] nova-conductor[52216]: ERROR nova.conductor.manager result = self.transport._send( [ 658.873697] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 658.873697] nova-conductor[52216]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 658.873697] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 658.873697] nova-conductor[52216]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 658.873697] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 658.873697] nova-conductor[52216]: ERROR nova.conductor.manager raise result [ 658.873697] nova-conductor[52216]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 658.873697] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 658.873697] nova-conductor[52216]: ERROR nova.conductor.manager [ 658.873697] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 658.873697] nova-conductor[52216]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 658.873697] nova-conductor[52216]: ERROR nova.conductor.manager [ 658.873697] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 658.873697] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._select_destinations( [ 658.873697] nova-conductor[52216]: ERROR nova.conductor.manager [ 658.873697] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 658.873697] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._schedule( [ 658.873697] nova-conductor[52216]: ERROR nova.conductor.manager [ 658.873697] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 658.873697] nova-conductor[52216]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 658.873697] nova-conductor[52216]: ERROR nova.conductor.manager [ 658.873697] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 658.873697] nova-conductor[52216]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 658.873697] nova-conductor[52216]: ERROR nova.conductor.manager [ 658.873697] nova-conductor[52216]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 658.873697] nova-conductor[52216]: ERROR nova.conductor.manager [ 658.873697] nova-conductor[52216]: ERROR nova.conductor.manager [ 658.880911] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-818359be-85d9-404a-954b-ddb9478cbfa6 tempest-VolumesAdminNegativeTest-1360743485 tempest-VolumesAdminNegativeTest-1360743485-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 658.881165] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-818359be-85d9-404a-954b-ddb9478cbfa6 tempest-VolumesAdminNegativeTest-1360743485 tempest-VolumesAdminNegativeTest-1360743485-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 658.881339] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-818359be-85d9-404a-954b-ddb9478cbfa6 tempest-VolumesAdminNegativeTest-1360743485 tempest-VolumesAdminNegativeTest-1360743485-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 658.937410] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-818359be-85d9-404a-954b-ddb9478cbfa6 tempest-VolumesAdminNegativeTest-1360743485 tempest-VolumesAdminNegativeTest-1360743485-project-member] [instance: e75bac69-980c-4979-9226-99ac807e2cef] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 658.938143] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-818359be-85d9-404a-954b-ddb9478cbfa6 tempest-VolumesAdminNegativeTest-1360743485 tempest-VolumesAdminNegativeTest-1360743485-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 658.938353] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-818359be-85d9-404a-954b-ddb9478cbfa6 tempest-VolumesAdminNegativeTest-1360743485 tempest-VolumesAdminNegativeTest-1360743485-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 658.938523] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-818359be-85d9-404a-954b-ddb9478cbfa6 tempest-VolumesAdminNegativeTest-1360743485 tempest-VolumesAdminNegativeTest-1360743485-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 658.943262] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-818359be-85d9-404a-954b-ddb9478cbfa6 tempest-VolumesAdminNegativeTest-1360743485 tempest-VolumesAdminNegativeTest-1360743485-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 658.943262] nova-conductor[52216]: Traceback (most recent call last): [ 658.943262] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 658.943262] nova-conductor[52216]: return func(*args, **kwargs) [ 658.943262] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 658.943262] nova-conductor[52216]: selections = self._select_destinations( [ 658.943262] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 658.943262] nova-conductor[52216]: selections = self._schedule( [ 658.943262] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 658.943262] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 658.943262] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 658.943262] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 658.943262] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 658.943262] nova-conductor[52216]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 658.943793] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-818359be-85d9-404a-954b-ddb9478cbfa6 tempest-VolumesAdminNegativeTest-1360743485 tempest-VolumesAdminNegativeTest-1360743485-project-member] [instance: e75bac69-980c-4979-9226-99ac807e2cef] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 659.064033] nova-conductor[52217]: ERROR nova.conductor.manager [None req-03c5d4ec-efa6-4b5c-8721-fd512dc56025 tempest-FloatingIPsAssociationTestJSON-984485256 tempest-FloatingIPsAssociationTestJSON-984485256-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 659.064033] nova-conductor[52217]: Traceback (most recent call last): [ 659.064033] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 659.064033] nova-conductor[52217]: return func(*args, **kwargs) [ 659.064033] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 659.064033] nova-conductor[52217]: selections = self._select_destinations( [ 659.064033] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 659.064033] nova-conductor[52217]: selections = self._schedule( [ 659.064033] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 659.064033] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 659.064033] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 659.064033] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 659.064033] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 659.064033] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 659.064033] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 659.064033] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 659.064033] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 659.064033] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 659.064033] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 659.064033] nova-conductor[52217]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 659.064033] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 659.064033] nova-conductor[52217]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 659.064033] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 659.064033] nova-conductor[52217]: ERROR nova.conductor.manager result = self.transport._send( [ 659.064033] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 659.064033] nova-conductor[52217]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 659.064033] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 659.064033] nova-conductor[52217]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 659.064033] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 659.064033] nova-conductor[52217]: ERROR nova.conductor.manager raise result [ 659.064033] nova-conductor[52217]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 659.064033] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 659.064033] nova-conductor[52217]: ERROR nova.conductor.manager [ 659.064033] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 659.064033] nova-conductor[52217]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 659.064033] nova-conductor[52217]: ERROR nova.conductor.manager [ 659.064033] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 659.064033] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._select_destinations( [ 659.064033] nova-conductor[52217]: ERROR nova.conductor.manager [ 659.064033] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 659.064033] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._schedule( [ 659.064033] nova-conductor[52217]: ERROR nova.conductor.manager [ 659.064033] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 659.064033] nova-conductor[52217]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 659.064033] nova-conductor[52217]: ERROR nova.conductor.manager [ 659.064033] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 659.064033] nova-conductor[52217]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 659.064033] nova-conductor[52217]: ERROR nova.conductor.manager [ 659.064033] nova-conductor[52217]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 659.064033] nova-conductor[52217]: ERROR nova.conductor.manager [ 659.064033] nova-conductor[52217]: ERROR nova.conductor.manager [ 659.071027] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-03c5d4ec-efa6-4b5c-8721-fd512dc56025 tempest-FloatingIPsAssociationTestJSON-984485256 tempest-FloatingIPsAssociationTestJSON-984485256-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 659.071261] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-03c5d4ec-efa6-4b5c-8721-fd512dc56025 tempest-FloatingIPsAssociationTestJSON-984485256 tempest-FloatingIPsAssociationTestJSON-984485256-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 659.071448] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-03c5d4ec-efa6-4b5c-8721-fd512dc56025 tempest-FloatingIPsAssociationTestJSON-984485256 tempest-FloatingIPsAssociationTestJSON-984485256-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 659.133058] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-03c5d4ec-efa6-4b5c-8721-fd512dc56025 tempest-FloatingIPsAssociationTestJSON-984485256 tempest-FloatingIPsAssociationTestJSON-984485256-project-member] [instance: 51d2f224-36ac-4abd-ac58-8646366641dd] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 659.133266] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-03c5d4ec-efa6-4b5c-8721-fd512dc56025 tempest-FloatingIPsAssociationTestJSON-984485256 tempest-FloatingIPsAssociationTestJSON-984485256-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 659.133472] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-03c5d4ec-efa6-4b5c-8721-fd512dc56025 tempest-FloatingIPsAssociationTestJSON-984485256 tempest-FloatingIPsAssociationTestJSON-984485256-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 659.133634] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-03c5d4ec-efa6-4b5c-8721-fd512dc56025 tempest-FloatingIPsAssociationTestJSON-984485256 
tempest-FloatingIPsAssociationTestJSON-984485256-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 659.137410] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-03c5d4ec-efa6-4b5c-8721-fd512dc56025 tempest-FloatingIPsAssociationTestJSON-984485256 tempest-FloatingIPsAssociationTestJSON-984485256-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 659.137410] nova-conductor[52217]: Traceback (most recent call last): [ 659.137410] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 659.137410] nova-conductor[52217]: return func(*args, **kwargs) [ 659.137410] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 659.137410] nova-conductor[52217]: selections = self._select_destinations( [ 659.137410] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 659.137410] nova-conductor[52217]: selections = self._schedule( [ 659.137410] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 659.137410] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 659.137410] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 659.137410] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 659.137410] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 659.137410] nova-conductor[52217]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 659.139244] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-03c5d4ec-efa6-4b5c-8721-fd512dc56025 tempest-FloatingIPsAssociationTestJSON-984485256 tempest-FloatingIPsAssociationTestJSON-984485256-project-member] [instance: 51d2f224-36ac-4abd-ac58-8646366641dd] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 660.079755] nova-conductor[52216]: ERROR nova.scheduler.utils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73 was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 660.080360] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Rescheduling: True {{(pid=52216) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 660.080586] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. 
Exhausted all hosts available for retrying build failures for instance 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73. [ 660.080796] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73. [ 660.117036] nova-conductor[52217]: ERROR nova.conductor.manager [None req-f05c7485-37b9-47c8-a67a-37c6ff08feca tempest-MigrationsAdminTest-714504537 tempest-MigrationsAdminTest-714504537-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 660.117036] nova-conductor[52217]: Traceback (most recent call last): [ 660.117036] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 660.117036] nova-conductor[52217]: return func(*args, **kwargs) [ 660.117036] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 660.117036] nova-conductor[52217]: selections = self._select_destinations( [ 660.117036] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 660.117036] nova-conductor[52217]: selections = self._schedule( [ 660.117036] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 660.117036] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 660.117036] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 660.117036] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 660.117036] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 660.117036] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 660.117036] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 660.117036] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 660.117036] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 660.117036] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 660.117036] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 660.117036] nova-conductor[52217]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 660.117036] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 660.117036] nova-conductor[52217]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 660.117036] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 660.117036] nova-conductor[52217]: ERROR nova.conductor.manager result = self.transport._send( [ 660.117036] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 660.117036] nova-conductor[52217]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 660.117036] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 660.117036] nova-conductor[52217]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 660.117036] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 660.117036] nova-conductor[52217]: ERROR nova.conductor.manager raise result [ 660.117036] nova-conductor[52217]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 660.117036] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 660.117036] nova-conductor[52217]: ERROR nova.conductor.manager [ 660.117036] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 660.117036] nova-conductor[52217]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 660.117036] nova-conductor[52217]: ERROR nova.conductor.manager [ 660.117036] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 660.117036] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._select_destinations( [ 660.117036] nova-conductor[52217]: ERROR nova.conductor.manager [ 660.117036] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 660.117036] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._schedule( [ 660.117036] nova-conductor[52217]: ERROR nova.conductor.manager [ 660.117036] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 660.117036] nova-conductor[52217]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 660.117036] nova-conductor[52217]: ERROR nova.conductor.manager [ 660.117036] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 660.117036] nova-conductor[52217]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 660.117036] nova-conductor[52217]: ERROR nova.conductor.manager [ 660.117036] nova-conductor[52217]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 660.117036] nova-conductor[52217]: ERROR nova.conductor.manager [ 660.117036] nova-conductor[52217]: ERROR nova.conductor.manager [ 660.126909] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-f05c7485-37b9-47c8-a67a-37c6ff08feca tempest-MigrationsAdminTest-714504537 tempest-MigrationsAdminTest-714504537-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 660.127149] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-f05c7485-37b9-47c8-a67a-37c6ff08feca tempest-MigrationsAdminTest-714504537 tempest-MigrationsAdminTest-714504537-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 660.128151] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-f05c7485-37b9-47c8-a67a-37c6ff08feca tempest-MigrationsAdminTest-714504537 tempest-MigrationsAdminTest-714504537-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 660.191045] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-f05c7485-37b9-47c8-a67a-37c6ff08feca tempest-MigrationsAdminTest-714504537 tempest-MigrationsAdminTest-714504537-project-member] [instance: 1cb24bc5-f9b4-4c1f-86fe-e2fac8574647] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 660.194331] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-f05c7485-37b9-47c8-a67a-37c6ff08feca tempest-MigrationsAdminTest-714504537 tempest-MigrationsAdminTest-714504537-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 660.194331] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-f05c7485-37b9-47c8-a67a-37c6ff08feca tempest-MigrationsAdminTest-714504537 tempest-MigrationsAdminTest-714504537-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 660.194331] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-f05c7485-37b9-47c8-a67a-37c6ff08feca tempest-MigrationsAdminTest-714504537 tempest-MigrationsAdminTest-714504537-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 660.198680] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-f05c7485-37b9-47c8-a67a-37c6ff08feca tempest-MigrationsAdminTest-714504537 tempest-MigrationsAdminTest-714504537-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 660.198680] nova-conductor[52217]: Traceback (most recent call last): [ 660.198680] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 660.198680] nova-conductor[52217]: return func(*args, **kwargs) [ 660.198680] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 660.198680] nova-conductor[52217]: selections = self._select_destinations( [ 660.198680] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 660.198680] nova-conductor[52217]: selections = self._schedule( [ 660.198680] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 660.198680] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 660.198680] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 660.198680] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 660.198680] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 660.198680] nova-conductor[52217]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 660.199220] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-f05c7485-37b9-47c8-a67a-37c6ff08feca tempest-MigrationsAdminTest-714504537 tempest-MigrationsAdminTest-714504537-project-member] [instance: 1cb24bc5-f9b4-4c1f-86fe-e2fac8574647] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 661.340897] nova-conductor[52217]: ERROR nova.conductor.manager [None req-436388f0-b352-4320-b44c-fb30a591a0c1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 661.340897] nova-conductor[52217]: Traceback (most recent call last): [ 661.340897] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 661.340897] nova-conductor[52217]: return func(*args, **kwargs) [ 661.340897] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 661.340897] nova-conductor[52217]: selections = self._select_destinations( [ 661.340897] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 661.340897] nova-conductor[52217]: selections = self._schedule( [ 661.340897] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 661.340897] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 661.340897] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 661.340897] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 661.340897] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 661.340897] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 661.340897] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 661.340897] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 661.340897] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 661.340897] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 661.340897] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 661.340897] nova-conductor[52217]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 661.340897] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 661.340897] nova-conductor[52217]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 661.340897] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 661.340897] nova-conductor[52217]: ERROR nova.conductor.manager result = self.transport._send( [ 661.340897] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 661.340897] nova-conductor[52217]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 661.340897] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 661.340897] nova-conductor[52217]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 661.340897] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 661.340897] nova-conductor[52217]: ERROR nova.conductor.manager raise result [ 661.340897] nova-conductor[52217]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 661.340897] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 661.340897] nova-conductor[52217]: ERROR nova.conductor.manager [ 661.340897] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 661.340897] nova-conductor[52217]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 661.340897] nova-conductor[52217]: ERROR nova.conductor.manager [ 661.340897] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 661.340897] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._select_destinations( [ 661.340897] nova-conductor[52217]: ERROR nova.conductor.manager [ 661.340897] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 661.340897] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._schedule( [ 661.340897] nova-conductor[52217]: ERROR nova.conductor.manager [ 661.340897] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 661.340897] nova-conductor[52217]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 661.340897] nova-conductor[52217]: ERROR nova.conductor.manager [ 661.340897] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 661.340897] nova-conductor[52217]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 661.340897] nova-conductor[52217]: ERROR nova.conductor.manager [ 661.340897] nova-conductor[52217]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
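Every traceback in this capture bottoms out in the scheduler's _ensure_sufficient_hosts check (/opt/stack/nova/nova/scheduler/manager.py:499), which raises nova.exception.NoValidHost with the reason "There are not enough hosts available." whenever filtering leaves fewer candidate hosts than the request needs, typically because no enabled compute host passed the filters or none was registered with the scheduler at all. The following is a minimal, self-contained sketch of that kind of check, not Nova's actual implementation; the class and function below are simplified stand-ins for nova.exception.NoValidHost and SchedulerManager._ensure_sufficient_hosts, whose real signatures and logic differ.

```python
# Simplified, self-contained sketch of the host-sufficiency check seen in the
# tracebacks above. NOT Nova's actual code; names mirror the log only.

class NoValidHost(Exception):
    """Stand-in for nova.exception.NoValidHost."""
    def __init__(self, reason):
        super().__init__(f"No valid host was found. {reason}")


def ensure_sufficient_hosts(hosts, required_count):
    """Fail the scheduling request if filtering left too few candidate hosts."""
    if len(hosts) < required_count:
        # This condition produces the "There are not enough hosts available."
        # message repeated throughout the log.
        raise NoValidHost(reason="There are not enough hosts available.")


if __name__ == "__main__":
    try:
        # Zero candidate hosts but one instance requested -- mirrors the
        # failures captured above.
        ensure_sufficient_hosts(hosts=[], required_count=1)
    except NoValidHost as exc:
        print(f"scheduler would raise: {exc}")
```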
[ 661.340897] nova-conductor[52217]: ERROR nova.conductor.manager [ 661.340897] nova-conductor[52217]: ERROR nova.conductor.manager [ 661.347148] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-436388f0-b352-4320-b44c-fb30a591a0c1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 661.347419] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-436388f0-b352-4320-b44c-fb30a591a0c1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 661.347591] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-436388f0-b352-4320-b44c-fb30a591a0c1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 661.402081] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-436388f0-b352-4320-b44c-fb30a591a0c1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: af016b23-eba6-4038-9f11-919791e06ce1] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 661.402681] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-436388f0-b352-4320-b44c-fb30a591a0c1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 661.402906] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-436388f0-b352-4320-b44c-fb30a591a0c1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 661.403089] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-436388f0-b352-4320-b44c-fb30a591a0c1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 661.412174] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-436388f0-b352-4320-b44c-fb30a591a0c1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 661.412174] nova-conductor[52217]: Traceback (most recent call last): [ 661.412174] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 661.412174] nova-conductor[52217]: return func(*args, **kwargs) [ 661.412174] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 661.412174] nova-conductor[52217]: selections = self._select_destinations( [ 661.412174] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 661.412174] nova-conductor[52217]: selections = self._schedule( [ 661.412174] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 661.412174] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 661.412174] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 661.412174] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 661.412174] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 661.412174] nova-conductor[52217]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 661.412174] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-436388f0-b352-4320-b44c-fb30a591a0c1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: af016b23-eba6-4038-9f11-919791e06ce1] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 663.828045] nova-conductor[52216]: ERROR nova.conductor.manager [None req-a6ad3383-550a-40c5-857e-4403f3ed9c12 tempest-ServerDiagnosticsTest-1820856144 tempest-ServerDiagnosticsTest-1820856144-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 663.828045] nova-conductor[52216]: Traceback (most recent call last): [ 663.828045] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 663.828045] nova-conductor[52216]: return func(*args, **kwargs) [ 663.828045] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 663.828045] nova-conductor[52216]: selections = self._select_destinations( [ 663.828045] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 663.828045] nova-conductor[52216]: selections = self._schedule( [ 663.828045] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 663.828045] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 663.828045] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 663.828045] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 663.828045] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 663.828045] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 663.828045] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 663.828045] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 663.828045] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 663.828045] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 663.828045] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 663.828045] nova-conductor[52216]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 663.828045] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 663.828045] nova-conductor[52216]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 663.828045] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 663.828045] nova-conductor[52216]: ERROR nova.conductor.manager result = self.transport._send( [ 663.828045] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 663.828045] nova-conductor[52216]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 663.828045] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 663.828045] nova-conductor[52216]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 663.828045] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 663.828045] nova-conductor[52216]: ERROR nova.conductor.manager raise result [ 663.828045] nova-conductor[52216]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 663.828045] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 663.828045] nova-conductor[52216]: ERROR nova.conductor.manager [ 663.828045] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 663.828045] nova-conductor[52216]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 663.828045] nova-conductor[52216]: ERROR nova.conductor.manager [ 663.828045] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 663.828045] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._select_destinations( [ 663.828045] nova-conductor[52216]: ERROR nova.conductor.manager [ 663.828045] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 663.828045] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._schedule( [ 663.828045] nova-conductor[52216]: ERROR nova.conductor.manager [ 663.828045] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 663.828045] nova-conductor[52216]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 663.828045] nova-conductor[52216]: ERROR nova.conductor.manager [ 663.828045] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 663.828045] nova-conductor[52216]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 663.828045] nova-conductor[52216]: ERROR nova.conductor.manager [ 663.828045] nova-conductor[52216]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
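The double spelling in these records, nova.exception.NoValidHost inside the inner traceback and nova.exception_Remote.NoValidHost_Remote on the conductor side, comes from the RPC hop: the scheduler raises NoValidHost in its own process, oslo.messaging serializes it, and the conductor's amqpdriver re-raises it (the "raise result" frame) as a dynamically rebuilt copy whose class and module names carry a _Remote suffix. In the oslo.messaging behaviour commonly deployed, that copy subclasses the original exception type, so conductor code can still catch nova.exception.NoValidHost, log the "Failed to schedule instances" error, and push the instance to ERROR, which is what the warnings above reflect. The sketch below illustrates that handling pattern only; Instance, select_destinations_rpc and schedule_and_build_instance are simplified placeholders, not Nova's real objects or method signatures.

```python
# Minimal sketch of the conductor-side pattern reflected in the warnings above.
# Illustration only; all names are placeholders, not Nova's real code.
import logging

LOG = logging.getLogger("sketch.conductor")


class NoValidHost(Exception):
    """Stand-in for nova.exception.NoValidHost (and its _Remote subclass)."""


class Instance:
    def __init__(self, uuid):
        self.uuid = uuid
        self.vm_state = "building"

    def save(self):
        pass  # A real instance object would persist the state change.


def select_destinations_rpc(request_spec):
    # Stand-in for the scheduler RPC call; here it always fails the way the
    # captured requests do.
    raise NoValidHost("No valid host was found. "
                      "There are not enough hosts available.")


def schedule_and_build_instance(instance, request_spec):
    try:
        return select_destinations_rpc(request_spec)
    except NoValidHost as exc:
        LOG.error("Failed to schedule instances: %s", exc)
        LOG.warning("[instance: %s] Setting instance to ERROR state.",
                    instance.uuid)
        instance.vm_state = "error"
        instance.save()
        return None


if __name__ == "__main__":
    logging.basicConfig(level=logging.DEBUG)
    schedule_and_build_instance(
        Instance("1cb24bc5-f9b4-4c1f-86fe-e2fac8574647"), {})
```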
[ 663.828045] nova-conductor[52216]: ERROR nova.conductor.manager [ 663.828045] nova-conductor[52216]: ERROR nova.conductor.manager [ 663.841548] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-a6ad3383-550a-40c5-857e-4403f3ed9c12 tempest-ServerDiagnosticsTest-1820856144 tempest-ServerDiagnosticsTest-1820856144-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 663.843451] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-a6ad3383-550a-40c5-857e-4403f3ed9c12 tempest-ServerDiagnosticsTest-1820856144 tempest-ServerDiagnosticsTest-1820856144-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 663.843451] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-a6ad3383-550a-40c5-857e-4403f3ed9c12 tempest-ServerDiagnosticsTest-1820856144 tempest-ServerDiagnosticsTest-1820856144-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 663.914410] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-a6ad3383-550a-40c5-857e-4403f3ed9c12 tempest-ServerDiagnosticsTest-1820856144 tempest-ServerDiagnosticsTest-1820856144-project-member] [instance: 2bb5005e-a95d-44a8-9097-8762fec094ed] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 663.914640] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-a6ad3383-550a-40c5-857e-4403f3ed9c12 tempest-ServerDiagnosticsTest-1820856144 tempest-ServerDiagnosticsTest-1820856144-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 663.914856] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-a6ad3383-550a-40c5-857e-4403f3ed9c12 tempest-ServerDiagnosticsTest-1820856144 tempest-ServerDiagnosticsTest-1820856144-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 663.915030] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-a6ad3383-550a-40c5-857e-4403f3ed9c12 tempest-ServerDiagnosticsTest-1820856144 tempest-ServerDiagnosticsTest-1820856144-project-member] Lock "00000000-0000-0000-0000-000000000000" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 663.924691] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-a6ad3383-550a-40c5-857e-4403f3ed9c12 tempest-ServerDiagnosticsTest-1820856144 tempest-ServerDiagnosticsTest-1820856144-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 663.924691] nova-conductor[52216]: Traceback (most recent call last): [ 663.924691] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 663.924691] nova-conductor[52216]: return func(*args, **kwargs) [ 663.924691] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 663.924691] nova-conductor[52216]: selections = self._select_destinations( [ 663.924691] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 663.924691] nova-conductor[52216]: selections = self._schedule( [ 663.924691] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 663.924691] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 663.924691] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 663.924691] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 663.924691] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 663.924691] nova-conductor[52216]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 663.925280] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-a6ad3383-550a-40c5-857e-4403f3ed9c12 tempest-ServerDiagnosticsTest-1820856144 tempest-ServerDiagnosticsTest-1820856144-project-member] [instance: 2bb5005e-a95d-44a8-9097-8762fec094ed] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 668.442493] nova-conductor[52217]: ERROR nova.conductor.manager [None req-8b59bad6-c84e-45f9-9be5-185b948c0e38 tempest-VolumesAdminNegativeTest-1360743485 tempest-VolumesAdminNegativeTest-1360743485-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 668.442493] nova-conductor[52217]: Traceback (most recent call last): [ 668.442493] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 668.442493] nova-conductor[52217]: return func(*args, **kwargs) [ 668.442493] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 668.442493] nova-conductor[52217]: selections = self._select_destinations( [ 668.442493] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 668.442493] nova-conductor[52217]: selections = self._schedule( [ 668.442493] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 668.442493] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 668.442493] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 668.442493] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 668.442493] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 668.442493] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 668.442493] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 668.442493] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 668.442493] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 668.442493] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 668.442493] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 668.442493] nova-conductor[52217]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 668.442493] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 668.442493] nova-conductor[52217]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 668.442493] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 668.442493] nova-conductor[52217]: ERROR nova.conductor.manager result = self.transport._send( [ 668.442493] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 668.442493] nova-conductor[52217]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 668.442493] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 668.442493] nova-conductor[52217]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 668.442493] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 668.442493] nova-conductor[52217]: ERROR nova.conductor.manager raise result [ 668.442493] nova-conductor[52217]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 668.442493] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 668.442493] nova-conductor[52217]: ERROR nova.conductor.manager [ 668.442493] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 668.442493] nova-conductor[52217]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 668.442493] nova-conductor[52217]: ERROR nova.conductor.manager [ 668.442493] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 668.442493] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._select_destinations( [ 668.442493] nova-conductor[52217]: ERROR nova.conductor.manager [ 668.442493] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 668.442493] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._schedule( [ 668.442493] nova-conductor[52217]: ERROR nova.conductor.manager [ 668.442493] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 668.442493] nova-conductor[52217]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 668.442493] nova-conductor[52217]: ERROR nova.conductor.manager [ 668.442493] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 668.442493] nova-conductor[52217]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 668.442493] nova-conductor[52217]: ERROR nova.conductor.manager [ 668.442493] nova-conductor[52217]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
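Each failed build above is preceded by a nova.conductor.manager DEBUG record listing the block_device_mapping the conductor computed for the instance: a single image-backed root disk (source_type='image', destination_type='local', boot_index=0, delete_on_termination=True, image 2efa4364-ba59-4de9-978f-169a769ee710) and no Cinder volume. As a rough illustration only, the same mapping expressed as a block_device_mapping_v2 entry in a server-create request body would look something like the sketch below; FLAVOR_ID is a placeholder and other required request fields are omitted.

```python
# Rough illustration only: the BlockDeviceMapping logged above, re-expressed as
# a block_device_mapping_v2 entry in a "create server" request body. Values are
# copied from the log record; FLAVOR_ID is a placeholder, and other required
# fields (networks, etc.) are omitted for brevity.
server_create_body = {
    "server": {
        "name": "example-server",
        "flavorRef": "FLAVOR_ID",
        "block_device_mapping_v2": [
            {
                "boot_index": 0,
                "source_type": "image",
                "destination_type": "local",  # ephemeral root disk, no volume
                "uuid": "2efa4364-ba59-4de9-978f-169a769ee710",
                "delete_on_termination": True,
            }
        ],
    }
}
```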
[ 668.442493] nova-conductor[52217]: ERROR nova.conductor.manager [ 668.442493] nova-conductor[52217]: ERROR nova.conductor.manager [ 668.457990] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-8b59bad6-c84e-45f9-9be5-185b948c0e38 tempest-VolumesAdminNegativeTest-1360743485 tempest-VolumesAdminNegativeTest-1360743485-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 668.457990] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-8b59bad6-c84e-45f9-9be5-185b948c0e38 tempest-VolumesAdminNegativeTest-1360743485 tempest-VolumesAdminNegativeTest-1360743485-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 668.457990] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-8b59bad6-c84e-45f9-9be5-185b948c0e38 tempest-VolumesAdminNegativeTest-1360743485 tempest-VolumesAdminNegativeTest-1360743485-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 668.518339] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-8b59bad6-c84e-45f9-9be5-185b948c0e38 tempest-VolumesAdminNegativeTest-1360743485 tempest-VolumesAdminNegativeTest-1360743485-project-member] [instance: cd7db6db-62b8-44c9-a0d2-7de4b93625c0] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 668.519936] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-8b59bad6-c84e-45f9-9be5-185b948c0e38 tempest-VolumesAdminNegativeTest-1360743485 tempest-VolumesAdminNegativeTest-1360743485-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 668.519936] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-8b59bad6-c84e-45f9-9be5-185b948c0e38 tempest-VolumesAdminNegativeTest-1360743485 tempest-VolumesAdminNegativeTest-1360743485-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 668.519936] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-8b59bad6-c84e-45f9-9be5-185b948c0e38 tempest-VolumesAdminNegativeTest-1360743485 tempest-VolumesAdminNegativeTest-1360743485-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 668.525705] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-8b59bad6-c84e-45f9-9be5-185b948c0e38 tempest-VolumesAdminNegativeTest-1360743485 tempest-VolumesAdminNegativeTest-1360743485-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 668.525705] nova-conductor[52217]: Traceback (most recent call last): [ 668.525705] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 668.525705] nova-conductor[52217]: return func(*args, **kwargs) [ 668.525705] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 668.525705] nova-conductor[52217]: selections = self._select_destinations( [ 668.525705] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 668.525705] nova-conductor[52217]: selections = self._schedule( [ 668.525705] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 668.525705] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 668.525705] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 668.525705] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 668.525705] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 668.525705] nova-conductor[52217]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 668.525705] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-8b59bad6-c84e-45f9-9be5-185b948c0e38 tempest-VolumesAdminNegativeTest-1360743485 tempest-VolumesAdminNegativeTest-1360743485-project-member] [instance: cd7db6db-62b8-44c9-a0d2-7de4b93625c0] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 669.470962] nova-conductor[52216]: ERROR nova.conductor.manager [None req-a878eeba-c333-45e2-9603-3b1e0f119917 tempest-ServersTestMultiNic-235525167 tempest-ServersTestMultiNic-235525167-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 669.470962] nova-conductor[52216]: Traceback (most recent call last): [ 669.470962] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 669.470962] nova-conductor[52216]: return func(*args, **kwargs) [ 669.470962] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 669.470962] nova-conductor[52216]: selections = self._select_destinations( [ 669.470962] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 669.470962] nova-conductor[52216]: selections = self._schedule( [ 669.470962] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 669.470962] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 669.470962] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 669.470962] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 669.470962] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 669.470962] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 669.470962] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 669.470962] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 669.470962] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 669.470962] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 669.470962] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 669.470962] nova-conductor[52216]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 669.470962] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 669.470962] nova-conductor[52216]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 669.470962] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 669.470962] nova-conductor[52216]: ERROR nova.conductor.manager result = self.transport._send( [ 669.470962] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 669.470962] nova-conductor[52216]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 669.470962] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 669.470962] nova-conductor[52216]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 669.470962] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 669.470962] nova-conductor[52216]: ERROR nova.conductor.manager raise result [ 669.470962] nova-conductor[52216]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 669.470962] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 669.470962] nova-conductor[52216]: ERROR nova.conductor.manager [ 669.470962] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 669.470962] nova-conductor[52216]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 669.470962] nova-conductor[52216]: ERROR nova.conductor.manager [ 669.470962] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 669.470962] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._select_destinations( [ 669.470962] nova-conductor[52216]: ERROR nova.conductor.manager [ 669.470962] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 669.470962] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._schedule( [ 669.470962] nova-conductor[52216]: ERROR nova.conductor.manager [ 669.470962] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 669.470962] nova-conductor[52216]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 669.470962] nova-conductor[52216]: ERROR nova.conductor.manager [ 669.470962] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 669.470962] nova-conductor[52216]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 669.470962] nova-conductor[52216]: ERROR nova.conductor.manager [ 669.470962] nova-conductor[52216]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 669.470962] nova-conductor[52216]: ERROR nova.conductor.manager [ 669.470962] nova-conductor[52216]: ERROR nova.conductor.manager [ 669.480635] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-a878eeba-c333-45e2-9603-3b1e0f119917 tempest-ServersTestMultiNic-235525167 tempest-ServersTestMultiNic-235525167-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 669.481046] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-a878eeba-c333-45e2-9603-3b1e0f119917 tempest-ServersTestMultiNic-235525167 tempest-ServersTestMultiNic-235525167-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 669.481268] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-a878eeba-c333-45e2-9603-3b1e0f119917 tempest-ServersTestMultiNic-235525167 tempest-ServersTestMultiNic-235525167-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 669.555642] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-a878eeba-c333-45e2-9603-3b1e0f119917 tempest-ServersTestMultiNic-235525167 tempest-ServersTestMultiNic-235525167-project-member] [instance: e279efb7-5756-4c55-be2b-04bf2776cea4] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 669.556969] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-a878eeba-c333-45e2-9603-3b1e0f119917 tempest-ServersTestMultiNic-235525167 tempest-ServersTestMultiNic-235525167-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 669.557356] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-a878eeba-c333-45e2-9603-3b1e0f119917 tempest-ServersTestMultiNic-235525167 tempest-ServersTestMultiNic-235525167-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 669.558330] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-a878eeba-c333-45e2-9603-3b1e0f119917 tempest-ServersTestMultiNic-235525167 tempest-ServersTestMultiNic-235525167-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 669.561749] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-a878eeba-c333-45e2-9603-3b1e0f119917 tempest-ServersTestMultiNic-235525167 tempest-ServersTestMultiNic-235525167-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 669.561749] nova-conductor[52216]: Traceback (most recent call last): [ 669.561749] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 669.561749] nova-conductor[52216]: return func(*args, **kwargs) [ 669.561749] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 669.561749] nova-conductor[52216]: selections = self._select_destinations( [ 669.561749] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 669.561749] nova-conductor[52216]: selections = self._schedule( [ 669.561749] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 669.561749] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 669.561749] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 669.561749] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 669.561749] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 669.561749] nova-conductor[52216]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 669.562233] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-a878eeba-c333-45e2-9603-3b1e0f119917 tempest-ServersTestMultiNic-235525167 tempest-ServersTestMultiNic-235525167-project-member] [instance: e279efb7-5756-4c55-be2b-04bf2776cea4] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 670.386736] nova-conductor[52217]: ERROR nova.conductor.manager [None req-e45d9f28-9360-4b28-a11d-0c80b18c2a98 tempest-ServersAdminNegativeTestJSON-598870793 tempest-ServersAdminNegativeTestJSON-598870793-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 670.386736] nova-conductor[52217]: Traceback (most recent call last): [ 670.386736] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 670.386736] nova-conductor[52217]: return func(*args, **kwargs) [ 670.386736] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 670.386736] nova-conductor[52217]: selections = self._select_destinations( [ 670.386736] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 670.386736] nova-conductor[52217]: selections = self._schedule( [ 670.386736] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 670.386736] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 670.386736] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 670.386736] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 670.386736] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 670.386736] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 670.386736] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 670.386736] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 670.386736] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 670.386736] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 670.386736] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 670.386736] nova-conductor[52217]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 670.386736] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 670.386736] nova-conductor[52217]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 670.386736] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 670.386736] nova-conductor[52217]: ERROR nova.conductor.manager result = self.transport._send( [ 670.386736] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 670.386736] nova-conductor[52217]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 670.386736] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 670.386736] nova-conductor[52217]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 670.386736] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 670.386736] nova-conductor[52217]: ERROR nova.conductor.manager raise result [ 670.386736] nova-conductor[52217]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 670.386736] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 670.386736] nova-conductor[52217]: ERROR nova.conductor.manager [ 670.386736] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 670.386736] nova-conductor[52217]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 670.386736] nova-conductor[52217]: ERROR nova.conductor.manager [ 670.386736] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 670.386736] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._select_destinations( [ 670.386736] nova-conductor[52217]: ERROR nova.conductor.manager [ 670.386736] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 670.386736] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._schedule( [ 670.386736] nova-conductor[52217]: ERROR nova.conductor.manager [ 670.386736] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 670.386736] nova-conductor[52217]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 670.386736] nova-conductor[52217]: ERROR nova.conductor.manager [ 670.386736] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 670.386736] nova-conductor[52217]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 670.386736] nova-conductor[52217]: ERROR nova.conductor.manager [ 670.386736] nova-conductor[52217]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 670.386736] nova-conductor[52217]: ERROR nova.conductor.manager [ 670.386736] nova-conductor[52217]: ERROR nova.conductor.manager [ 670.403275] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e45d9f28-9360-4b28-a11d-0c80b18c2a98 tempest-ServersAdminNegativeTestJSON-598870793 tempest-ServersAdminNegativeTestJSON-598870793-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 670.403511] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e45d9f28-9360-4b28-a11d-0c80b18c2a98 tempest-ServersAdminNegativeTestJSON-598870793 tempest-ServersAdminNegativeTestJSON-598870793-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 670.403893] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e45d9f28-9360-4b28-a11d-0c80b18c2a98 tempest-ServersAdminNegativeTestJSON-598870793 tempest-ServersAdminNegativeTestJSON-598870793-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 670.489281] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-e45d9f28-9360-4b28-a11d-0c80b18c2a98 tempest-ServersAdminNegativeTestJSON-598870793 tempest-ServersAdminNegativeTestJSON-598870793-project-member] [instance: 3fabb13b-1a95-4b65-8ace-46baf92ee929] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 670.490934] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e45d9f28-9360-4b28-a11d-0c80b18c2a98 tempest-ServersAdminNegativeTestJSON-598870793 tempest-ServersAdminNegativeTestJSON-598870793-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 670.491238] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e45d9f28-9360-4b28-a11d-0c80b18c2a98 tempest-ServersAdminNegativeTestJSON-598870793 tempest-ServersAdminNegativeTestJSON-598870793-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 670.491442] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e45d9f28-9360-4b28-a11d-0c80b18c2a98 tempest-ServersAdminNegativeTestJSON-598870793 
tempest-ServersAdminNegativeTestJSON-598870793-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 670.497681] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-e45d9f28-9360-4b28-a11d-0c80b18c2a98 tempest-ServersAdminNegativeTestJSON-598870793 tempest-ServersAdminNegativeTestJSON-598870793-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 670.497681] nova-conductor[52217]: Traceback (most recent call last): [ 670.497681] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 670.497681] nova-conductor[52217]: return func(*args, **kwargs) [ 670.497681] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 670.497681] nova-conductor[52217]: selections = self._select_destinations( [ 670.497681] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 670.497681] nova-conductor[52217]: selections = self._schedule( [ 670.497681] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 670.497681] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 670.497681] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 670.497681] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 670.497681] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 670.497681] nova-conductor[52217]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 670.498245] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-e45d9f28-9360-4b28-a11d-0c80b18c2a98 tempest-ServersAdminNegativeTestJSON-598870793 tempest-ServersAdminNegativeTestJSON-598870793-project-member] [instance: 3fabb13b-1a95-4b65-8ace-46baf92ee929] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 672.579792] nova-conductor[52216]: ERROR nova.conductor.manager [None req-aee10459-57c4-45c4-9e85-4d778ba4b0af tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 672.579792] nova-conductor[52216]: Traceback (most recent call last): [ 672.579792] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 672.579792] nova-conductor[52216]: return func(*args, **kwargs) [ 672.579792] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 672.579792] nova-conductor[52216]: selections = self._select_destinations( [ 672.579792] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 672.579792] nova-conductor[52216]: selections = self._schedule( [ 672.579792] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 672.579792] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 672.579792] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 672.579792] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 672.579792] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 672.579792] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 672.579792] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 672.579792] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 672.579792] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 672.579792] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 672.579792] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 672.579792] nova-conductor[52216]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 672.579792] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 672.579792] nova-conductor[52216]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 672.579792] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 672.579792] nova-conductor[52216]: ERROR nova.conductor.manager result = self.transport._send( [ 672.579792] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 672.579792] nova-conductor[52216]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 672.579792] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 672.579792] nova-conductor[52216]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 672.579792] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 672.579792] nova-conductor[52216]: ERROR nova.conductor.manager raise result [ 672.579792] nova-conductor[52216]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 672.579792] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 672.579792] nova-conductor[52216]: ERROR nova.conductor.manager [ 672.579792] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 672.579792] nova-conductor[52216]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 672.579792] nova-conductor[52216]: ERROR nova.conductor.manager [ 672.579792] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 672.579792] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._select_destinations( [ 672.579792] nova-conductor[52216]: ERROR nova.conductor.manager [ 672.579792] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 672.579792] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._schedule( [ 672.579792] nova-conductor[52216]: ERROR nova.conductor.manager [ 672.579792] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 672.579792] nova-conductor[52216]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 672.579792] nova-conductor[52216]: ERROR nova.conductor.manager [ 672.579792] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 672.579792] nova-conductor[52216]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 672.579792] nova-conductor[52216]: ERROR nova.conductor.manager [ 672.579792] nova-conductor[52216]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 672.579792] nova-conductor[52216]: ERROR nova.conductor.manager [ 672.579792] nova-conductor[52216]: ERROR nova.conductor.manager [ 672.592926] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-aee10459-57c4-45c4-9e85-4d778ba4b0af tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 672.593192] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-aee10459-57c4-45c4-9e85-4d778ba4b0af tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 672.593497] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-aee10459-57c4-45c4-9e85-4d778ba4b0af tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 672.642394] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-aee10459-57c4-45c4-9e85-4d778ba4b0af tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: 98b908e7-70cf-412b-93d7-8fc790a9fda8] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 672.642394] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-aee10459-57c4-45c4-9e85-4d778ba4b0af tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 672.642394] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-aee10459-57c4-45c4-9e85-4d778ba4b0af tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 672.642394] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-aee10459-57c4-45c4-9e85-4d778ba4b0af tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 672.650460] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-aee10459-57c4-45c4-9e85-4d778ba4b0af tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 672.650460] nova-conductor[52216]: Traceback (most recent call last): [ 672.650460] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 672.650460] nova-conductor[52216]: return func(*args, **kwargs) [ 672.650460] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 672.650460] nova-conductor[52216]: selections = self._select_destinations( [ 672.650460] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 672.650460] nova-conductor[52216]: selections = self._schedule( [ 672.650460] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 672.650460] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 672.650460] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 672.650460] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 672.650460] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 672.650460] nova-conductor[52216]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 672.654432] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-aee10459-57c4-45c4-9e85-4d778ba4b0af tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: 98b908e7-70cf-412b-93d7-8fc790a9fda8] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 678.258832] nova-conductor[52217]: ERROR nova.conductor.manager [None req-49888db7-0528-4a14-b99e-1f63abfcf0b2 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 678.258832] nova-conductor[52217]: Traceback (most recent call last): [ 678.258832] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 678.258832] nova-conductor[52217]: return func(*args, **kwargs) [ 678.258832] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 678.258832] nova-conductor[52217]: selections = self._select_destinations( [ 678.258832] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 678.258832] nova-conductor[52217]: selections = self._schedule( [ 678.258832] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 678.258832] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 678.258832] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 678.258832] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 678.258832] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 678.258832] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 678.258832] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 678.258832] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 678.258832] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 678.258832] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 678.258832] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 678.258832] nova-conductor[52217]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 678.258832] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 678.258832] nova-conductor[52217]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 678.258832] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 678.258832] nova-conductor[52217]: ERROR nova.conductor.manager result = self.transport._send( [ 678.258832] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 678.258832] nova-conductor[52217]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 678.258832] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 678.258832] nova-conductor[52217]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 678.258832] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 678.258832] nova-conductor[52217]: ERROR nova.conductor.manager raise result [ 678.258832] nova-conductor[52217]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 678.258832] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 678.258832] nova-conductor[52217]: ERROR nova.conductor.manager [ 678.258832] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 678.258832] nova-conductor[52217]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 678.258832] nova-conductor[52217]: ERROR nova.conductor.manager [ 678.258832] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 678.258832] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._select_destinations( [ 678.258832] nova-conductor[52217]: ERROR nova.conductor.manager [ 678.258832] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 678.258832] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._schedule( [ 678.258832] nova-conductor[52217]: ERROR nova.conductor.manager [ 678.258832] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 678.258832] nova-conductor[52217]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 678.258832] nova-conductor[52217]: ERROR nova.conductor.manager [ 678.258832] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 678.258832] nova-conductor[52217]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 678.258832] nova-conductor[52217]: ERROR nova.conductor.manager [ 678.258832] nova-conductor[52217]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 678.258832] nova-conductor[52217]: ERROR nova.conductor.manager [ 678.258832] nova-conductor[52217]: ERROR nova.conductor.manager [ 678.258832] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-49888db7-0528-4a14-b99e-1f63abfcf0b2 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 678.258832] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-49888db7-0528-4a14-b99e-1f63abfcf0b2 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 678.262082] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-49888db7-0528-4a14-b99e-1f63abfcf0b2 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 678.345650] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-49888db7-0528-4a14-b99e-1f63abfcf0b2 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: cfcc866e-bac3-4417-809d-0f00292731a5] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 678.345650] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-49888db7-0528-4a14-b99e-1f63abfcf0b2 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 678.345650] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-49888db7-0528-4a14-b99e-1f63abfcf0b2 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 678.345650] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-49888db7-0528-4a14-b99e-1f63abfcf0b2 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 678.351497] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-49888db7-0528-4a14-b99e-1f63abfcf0b2 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 678.351497] nova-conductor[52217]: Traceback (most recent call last): [ 678.351497] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 678.351497] nova-conductor[52217]: return func(*args, **kwargs) [ 678.351497] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 678.351497] nova-conductor[52217]: selections = self._select_destinations( [ 678.351497] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 678.351497] nova-conductor[52217]: selections = self._schedule( [ 678.351497] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 678.351497] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 678.351497] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 678.351497] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 678.351497] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 678.351497] nova-conductor[52217]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 678.352501] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-49888db7-0528-4a14-b99e-1f63abfcf0b2 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: cfcc866e-bac3-4417-809d-0f00292731a5] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 682.730570] nova-conductor[52217]: ERROR nova.conductor.manager [None req-01be598e-70ca-4d6c-90d4-7fddc7777332 tempest-SecurityGroupsTestJSON-2012520425 tempest-SecurityGroupsTestJSON-2012520425-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 682.730570] nova-conductor[52217]: Traceback (most recent call last): [ 682.730570] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 682.730570] nova-conductor[52217]: return func(*args, **kwargs) [ 682.730570] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 682.730570] nova-conductor[52217]: selections = self._select_destinations( [ 682.730570] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 682.730570] nova-conductor[52217]: selections = self._schedule( [ 682.730570] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 682.730570] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 682.730570] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 682.730570] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 682.730570] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 682.730570] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 682.730570] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 682.730570] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 682.730570] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 682.730570] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 682.730570] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 682.730570] nova-conductor[52217]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 682.730570] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 682.730570] nova-conductor[52217]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 682.730570] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 682.730570] nova-conductor[52217]: ERROR nova.conductor.manager result = self.transport._send( [ 682.730570] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 682.730570] nova-conductor[52217]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 682.730570] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 682.730570] nova-conductor[52217]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 682.730570] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 682.730570] nova-conductor[52217]: ERROR nova.conductor.manager raise result [ 682.730570] nova-conductor[52217]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 682.730570] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 682.730570] nova-conductor[52217]: ERROR nova.conductor.manager [ 682.730570] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 682.730570] nova-conductor[52217]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 682.730570] nova-conductor[52217]: ERROR nova.conductor.manager [ 682.730570] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 682.730570] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._select_destinations( [ 682.730570] nova-conductor[52217]: ERROR nova.conductor.manager [ 682.730570] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 682.730570] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._schedule( [ 682.730570] nova-conductor[52217]: ERROR nova.conductor.manager [ 682.730570] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 682.730570] nova-conductor[52217]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 682.730570] nova-conductor[52217]: ERROR nova.conductor.manager [ 682.730570] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 682.730570] nova-conductor[52217]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 682.730570] nova-conductor[52217]: ERROR nova.conductor.manager [ 682.730570] nova-conductor[52217]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 682.730570] nova-conductor[52217]: ERROR nova.conductor.manager [ 682.730570] nova-conductor[52217]: ERROR nova.conductor.manager [ 682.738565] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-01be598e-70ca-4d6c-90d4-7fddc7777332 tempest-SecurityGroupsTestJSON-2012520425 tempest-SecurityGroupsTestJSON-2012520425-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 682.738811] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-01be598e-70ca-4d6c-90d4-7fddc7777332 tempest-SecurityGroupsTestJSON-2012520425 tempest-SecurityGroupsTestJSON-2012520425-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 682.738946] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-01be598e-70ca-4d6c-90d4-7fddc7777332 tempest-SecurityGroupsTestJSON-2012520425 tempest-SecurityGroupsTestJSON-2012520425-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 682.788522] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-01be598e-70ca-4d6c-90d4-7fddc7777332 tempest-SecurityGroupsTestJSON-2012520425 tempest-SecurityGroupsTestJSON-2012520425-project-member] [instance: e0472520-c69a-44a8-bf3f-08689ee5c833] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 682.789452] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-01be598e-70ca-4d6c-90d4-7fddc7777332 tempest-SecurityGroupsTestJSON-2012520425 tempest-SecurityGroupsTestJSON-2012520425-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 682.789524] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-01be598e-70ca-4d6c-90d4-7fddc7777332 tempest-SecurityGroupsTestJSON-2012520425 tempest-SecurityGroupsTestJSON-2012520425-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 682.789930] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-01be598e-70ca-4d6c-90d4-7fddc7777332 tempest-SecurityGroupsTestJSON-2012520425 tempest-SecurityGroupsTestJSON-2012520425-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 682.793179] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-01be598e-70ca-4d6c-90d4-7fddc7777332 tempest-SecurityGroupsTestJSON-2012520425 tempest-SecurityGroupsTestJSON-2012520425-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 682.793179] nova-conductor[52217]: Traceback (most recent call last): [ 682.793179] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 682.793179] nova-conductor[52217]: return func(*args, **kwargs) [ 682.793179] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 682.793179] nova-conductor[52217]: selections = self._select_destinations( [ 682.793179] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 682.793179] nova-conductor[52217]: selections = self._schedule( [ 682.793179] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 682.793179] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 682.793179] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 682.793179] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 682.793179] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 682.793179] nova-conductor[52217]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 682.793751] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-01be598e-70ca-4d6c-90d4-7fddc7777332 tempest-SecurityGroupsTestJSON-2012520425 tempest-SecurityGroupsTestJSON-2012520425-project-member] [instance: e0472520-c69a-44a8-bf3f-08689ee5c833] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 684.164617] nova-conductor[52216]: ERROR nova.conductor.manager [None req-466d45d8-a48e-4fe9-a074-95ecca19ca90 tempest-ServerRescueTestJSON-456204069 tempest-ServerRescueTestJSON-456204069-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 684.164617] nova-conductor[52216]: Traceback (most recent call last): [ 684.164617] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 684.164617] nova-conductor[52216]: return func(*args, **kwargs) [ 684.164617] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 684.164617] nova-conductor[52216]: selections = self._select_destinations( [ 684.164617] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 684.164617] nova-conductor[52216]: selections = self._schedule( [ 684.164617] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 684.164617] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 684.164617] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 684.164617] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 684.164617] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 684.164617] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 684.164617] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 684.164617] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 684.164617] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 684.164617] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 684.164617] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 684.164617] nova-conductor[52216]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 684.164617] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 684.164617] nova-conductor[52216]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 684.164617] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 684.164617] nova-conductor[52216]: ERROR nova.conductor.manager result = self.transport._send( [ 684.164617] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 684.164617] nova-conductor[52216]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 684.164617] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 684.164617] nova-conductor[52216]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 684.164617] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 684.164617] nova-conductor[52216]: ERROR nova.conductor.manager raise result [ 684.164617] nova-conductor[52216]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 684.164617] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 684.164617] nova-conductor[52216]: ERROR nova.conductor.manager [ 684.164617] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 684.164617] nova-conductor[52216]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 684.164617] nova-conductor[52216]: ERROR nova.conductor.manager [ 684.164617] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 684.164617] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._select_destinations( [ 684.164617] nova-conductor[52216]: ERROR nova.conductor.manager [ 684.164617] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 684.164617] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._schedule( [ 684.164617] nova-conductor[52216]: ERROR nova.conductor.manager [ 684.164617] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 684.164617] nova-conductor[52216]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 684.164617] nova-conductor[52216]: ERROR nova.conductor.manager [ 684.164617] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 684.164617] nova-conductor[52216]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 684.164617] nova-conductor[52216]: ERROR nova.conductor.manager [ 684.164617] nova-conductor[52216]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 684.164617] nova-conductor[52216]: ERROR nova.conductor.manager [ 684.164617] nova-conductor[52216]: ERROR nova.conductor.manager [ 684.175844] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-466d45d8-a48e-4fe9-a074-95ecca19ca90 tempest-ServerRescueTestJSON-456204069 tempest-ServerRescueTestJSON-456204069-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 684.176157] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-466d45d8-a48e-4fe9-a074-95ecca19ca90 tempest-ServerRescueTestJSON-456204069 tempest-ServerRescueTestJSON-456204069-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 684.176348] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-466d45d8-a48e-4fe9-a074-95ecca19ca90 tempest-ServerRescueTestJSON-456204069 tempest-ServerRescueTestJSON-456204069-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 684.237365] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-466d45d8-a48e-4fe9-a074-95ecca19ca90 tempest-ServerRescueTestJSON-456204069 tempest-ServerRescueTestJSON-456204069-project-member] [instance: a3d05b8b-79b0-4b2c-abbc-242e81659a04] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 684.237988] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-466d45d8-a48e-4fe9-a074-95ecca19ca90 tempest-ServerRescueTestJSON-456204069 tempest-ServerRescueTestJSON-456204069-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 684.238244] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-466d45d8-a48e-4fe9-a074-95ecca19ca90 tempest-ServerRescueTestJSON-456204069 tempest-ServerRescueTestJSON-456204069-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 684.239356] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-466d45d8-a48e-4fe9-a074-95ecca19ca90 tempest-ServerRescueTestJSON-456204069 tempest-ServerRescueTestJSON-456204069-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 684.249382] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-466d45d8-a48e-4fe9-a074-95ecca19ca90 tempest-ServerRescueTestJSON-456204069 tempest-ServerRescueTestJSON-456204069-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 684.249382] nova-conductor[52216]: Traceback (most recent call last): [ 684.249382] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 684.249382] nova-conductor[52216]: return func(*args, **kwargs) [ 684.249382] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 684.249382] nova-conductor[52216]: selections = self._select_destinations( [ 684.249382] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 684.249382] nova-conductor[52216]: selections = self._schedule( [ 684.249382] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 684.249382] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 684.249382] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 684.249382] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 684.249382] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 684.249382] nova-conductor[52216]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 684.249945] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-466d45d8-a48e-4fe9-a074-95ecca19ca90 tempest-ServerRescueTestJSON-456204069 tempest-ServerRescueTestJSON-456204069-project-member] [instance: a3d05b8b-79b0-4b2c-abbc-242e81659a04] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 685.188794] nova-conductor[52217]: ERROR nova.conductor.manager [None req-e6b458e6-1262-44c3-b4df-a6342292dee0 tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 685.188794] nova-conductor[52217]: Traceback (most recent call last): [ 685.188794] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 685.188794] nova-conductor[52217]: return func(*args, **kwargs) [ 685.188794] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 685.188794] nova-conductor[52217]: selections = self._select_destinations( [ 685.188794] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 685.188794] nova-conductor[52217]: selections = self._schedule( [ 685.188794] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 685.188794] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 685.188794] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 685.188794] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 685.188794] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 685.188794] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 685.188794] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 685.188794] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 685.188794] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 685.188794] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 685.188794] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 685.188794] nova-conductor[52217]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 685.188794] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 685.188794] nova-conductor[52217]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 685.188794] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 685.188794] nova-conductor[52217]: ERROR nova.conductor.manager result = self.transport._send( [ 685.188794] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 685.188794] nova-conductor[52217]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 685.188794] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 685.188794] nova-conductor[52217]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 685.188794] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 685.188794] nova-conductor[52217]: ERROR nova.conductor.manager raise result [ 685.188794] nova-conductor[52217]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 685.188794] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 685.188794] nova-conductor[52217]: ERROR nova.conductor.manager [ 685.188794] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 685.188794] nova-conductor[52217]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 685.188794] nova-conductor[52217]: ERROR nova.conductor.manager [ 685.188794] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 685.188794] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._select_destinations( [ 685.188794] nova-conductor[52217]: ERROR nova.conductor.manager [ 685.188794] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 685.188794] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._schedule( [ 685.188794] nova-conductor[52217]: ERROR nova.conductor.manager [ 685.188794] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 685.188794] nova-conductor[52217]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 685.188794] nova-conductor[52217]: ERROR nova.conductor.manager [ 685.188794] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 685.188794] nova-conductor[52217]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 685.188794] nova-conductor[52217]: ERROR nova.conductor.manager [ 685.188794] nova-conductor[52217]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 685.188794] nova-conductor[52217]: ERROR nova.conductor.manager [ 685.188794] nova-conductor[52217]: ERROR nova.conductor.manager [ 685.203897] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e6b458e6-1262-44c3-b4df-a6342292dee0 tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 685.204465] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e6b458e6-1262-44c3-b4df-a6342292dee0 tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 685.204568] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e6b458e6-1262-44c3-b4df-a6342292dee0 tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 685.317524] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-e6b458e6-1262-44c3-b4df-a6342292dee0 tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] [instance: cb250224-37da-4269-a5e8-16a27f26cfa0] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 685.317912] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e6b458e6-1262-44c3-b4df-a6342292dee0 tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 685.318290] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e6b458e6-1262-44c3-b4df-a6342292dee0 tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 685.318595] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e6b458e6-1262-44c3-b4df-a6342292dee0 tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 685.325723] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-e6b458e6-1262-44c3-b4df-a6342292dee0 tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 685.325723] nova-conductor[52217]: Traceback (most recent call last): [ 685.325723] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 685.325723] nova-conductor[52217]: return func(*args, **kwargs) [ 685.325723] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 685.325723] nova-conductor[52217]: selections = self._select_destinations( [ 685.325723] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 685.325723] nova-conductor[52217]: selections = self._schedule( [ 685.325723] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 685.325723] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 685.325723] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 685.325723] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 685.325723] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 685.325723] nova-conductor[52217]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 685.325723] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-e6b458e6-1262-44c3-b4df-a6342292dee0 tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] [instance: cb250224-37da-4269-a5e8-16a27f26cfa0] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 686.876637] nova-conductor[52216]: ERROR nova.conductor.manager [None req-1af3df5e-e50c-49ba-97cf-4b633f78ce71 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 686.876637] nova-conductor[52216]: Traceback (most recent call last): [ 686.876637] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 686.876637] nova-conductor[52216]: return func(*args, **kwargs) [ 686.876637] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 686.876637] nova-conductor[52216]: selections = self._select_destinations( [ 686.876637] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 686.876637] nova-conductor[52216]: selections = self._schedule( [ 686.876637] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 686.876637] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 686.876637] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 686.876637] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 686.876637] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 686.876637] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 686.876637] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 686.876637] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 686.876637] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 686.876637] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 686.876637] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 686.876637] nova-conductor[52216]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 686.876637] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 686.876637] nova-conductor[52216]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 686.876637] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 686.876637] nova-conductor[52216]: ERROR nova.conductor.manager result = self.transport._send( [ 686.876637] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 686.876637] nova-conductor[52216]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 686.876637] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 686.876637] nova-conductor[52216]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 686.876637] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 686.876637] nova-conductor[52216]: ERROR nova.conductor.manager raise result [ 686.876637] nova-conductor[52216]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 686.876637] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 686.876637] nova-conductor[52216]: ERROR nova.conductor.manager [ 686.876637] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 686.876637] nova-conductor[52216]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 686.876637] nova-conductor[52216]: ERROR nova.conductor.manager [ 686.876637] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 686.876637] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._select_destinations( [ 686.876637] nova-conductor[52216]: ERROR nova.conductor.manager [ 686.876637] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 686.876637] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._schedule( [ 686.876637] nova-conductor[52216]: ERROR nova.conductor.manager [ 686.876637] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 686.876637] nova-conductor[52216]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 686.876637] nova-conductor[52216]: ERROR nova.conductor.manager [ 686.876637] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 686.876637] nova-conductor[52216]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 686.876637] nova-conductor[52216]: ERROR nova.conductor.manager [ 686.876637] nova-conductor[52216]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 686.876637] nova-conductor[52216]: ERROR nova.conductor.manager [ 686.876637] nova-conductor[52216]: ERROR nova.conductor.manager [ 686.885433] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-1af3df5e-e50c-49ba-97cf-4b633f78ce71 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 686.885766] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-1af3df5e-e50c-49ba-97cf-4b633f78ce71 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 686.885866] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-1af3df5e-e50c-49ba-97cf-4b633f78ce71 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 686.930757] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-1af3df5e-e50c-49ba-97cf-4b633f78ce71 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: afc09864-67e5-4104-931f-4b18b90c69b1] block_device_mapping [BlockDeviceMapping(attachment_id=<?>,boot_index=0,connection_info=None,created_at=<?>,delete_on_termination=True,deleted=<?>,deleted_at=<?>,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=<?>,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=<?>,instance_uuid=<?>,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=<?>,uuid=<?>,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 686.931367] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-1af3df5e-e50c-49ba-97cf-4b633f78ce71 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 686.931689] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-1af3df5e-e50c-49ba-97cf-4b633f78ce71 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 686.931771] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-1af3df5e-e50c-49ba-97cf-4b633f78ce71 tempest-ListServerFiltersTestJSON-1569189728
tempest-ListServerFiltersTestJSON-1569189728-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 686.934942] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-1af3df5e-e50c-49ba-97cf-4b633f78ce71 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 686.934942] nova-conductor[52216]: Traceback (most recent call last): [ 686.934942] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 686.934942] nova-conductor[52216]: return func(*args, **kwargs) [ 686.934942] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 686.934942] nova-conductor[52216]: selections = self._select_destinations( [ 686.934942] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 686.934942] nova-conductor[52216]: selections = self._schedule( [ 686.934942] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 686.934942] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 686.934942] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 686.934942] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 686.934942] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 686.934942] nova-conductor[52216]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 686.935546] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-1af3df5e-e50c-49ba-97cf-4b633f78ce71 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: afc09864-67e5-4104-931f-4b18b90c69b1] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 688.085321] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Took 0.14 seconds to select destinations for 1 instance(s). 
{{(pid=52216) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 688.105116] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 688.105436] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 688.105553] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 688.146829] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 688.147085] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 688.147262] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 688.147637] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 688.147822] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 
tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 688.147981] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 688.161366] nova-conductor[52216]: DEBUG nova.quota [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Getting quotas for project 6313f0295f5f49a8a507833c28e32831. Resources: {'cores', 'instances', 'ram'} {{(pid=52216) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 688.164212] nova-conductor[52216]: DEBUG nova.quota [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Getting quotas for user c8b1d847b0d24854b7d6dae9b1e0d809 and project 6313f0295f5f49a8a507833c28e32831. Resources: {'cores', 'instances', 'ram'} {{(pid=52216) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 688.170052] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52216) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 688.170473] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 688.172078] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 688.172078] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 688.173720] nova-conductor[52216]: DEBUG 
nova.conductor.manager [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] block_device_mapping [BlockDeviceMapping(attachment_id=<?>,boot_index=0,connection_info=None,created_at=<?>,delete_on_termination=True,deleted=<?>,deleted_at=<?>,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=<?>,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=<?>,instance_uuid=<?>,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=<?>,uuid=<?>,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 688.174377] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 688.174580] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 688.174745] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 688.187247] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 688.187472] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 688.187642] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 690.247395] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Took 0.19 seconds to select destinations for 1 instance(s). {{(pid=52217) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 690.261901] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 690.262153] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 690.262328] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 690.301935] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 690.302312] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 690.304539] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 690.304539] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 690.304539] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 690.304539] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 690.315694] nova-conductor[52217]: DEBUG nova.quota [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Getting quotas for project cc4c8738af2b48f981e5f2feadb41a59. Resources: {'cores', 'instances', 'ram'} {{(pid=52217) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 690.325225] nova-conductor[52217]: DEBUG nova.quota [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Getting quotas for user 6316c8b7da8d4d3c97b2693b33729c52 and project cc4c8738af2b48f981e5f2feadb41a59. Resources: {'cores', 'instances', 'ram'} {{(pid=52217) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 690.333366] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52217) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 690.333366] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 690.333675] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 690.333960] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 
0.001s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 690.337382] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] block_device_mapping [BlockDeviceMapping(attachment_id=<?>,boot_index=0,connection_info=None,created_at=<?>,delete_on_termination=True,deleted=<?>,deleted_at=<?>,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=<?>,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=<?>,instance_uuid=<?>,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=<?>,uuid=<?>,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 690.338251] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 690.342329] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.004s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 690.342536] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 690.356893] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 690.357163] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 690.357335] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 691.260532] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Took 0.23 seconds to select destinations for 2 instance(s). {{(pid=52216) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 691.274024] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 691.274024] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 691.274024] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 691.304941] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 691.305178] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 691.306266] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 691.337142] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 691.338319] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 691.338549] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 691.339703] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 691.339703] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 691.339703] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 691.356861] nova-conductor[52216]: DEBUG nova.quota [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Getting quotas for project 1de6a55b95aa4af2865ec70142a20326. Resources: {'cores', 'instances', 'ram'} {{(pid=52216) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 691.361301] nova-conductor[52216]: DEBUG nova.quota [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Getting quotas for user bfbe2267ded84c71b3af181cb852d581 and project 1de6a55b95aa4af2865ec70142a20326. 
Resources: {'cores', 'instances', 'ram'} {{(pid=52216) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 691.369294] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52216) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 691.370135] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 691.370135] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 691.370227] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 691.374496] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] block_device_mapping [BlockDeviceMapping(attachment_id=<?>,boot_index=0,connection_info=None,created_at=<?>,delete_on_termination=True,deleted=<?>,deleted_at=<?>,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=<?>,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=<?>,instance_uuid=<?>,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=<?>,uuid=<?>,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 691.375354] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 691.375572] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 691.375743] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 691.393792] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 691.393792] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 691.393792] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 691.399939] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9807a449-4cca-416c-815d-99d5bc674464] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52216) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 691.400571] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 691.400776] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 691.400939] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 
tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 691.405473] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9807a449-4cca-416c-815d-99d5bc674464] block_device_mapping [BlockDeviceMapping(attachment_id=<?>,boot_index=0,connection_info=None,created_at=<?>,delete_on_termination=True,deleted=<?>,deleted_at=<?>,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=<?>,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=<?>,instance_uuid=<?>,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=<?>,uuid=<?>,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 691.405651] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 691.406195] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 691.406195] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 691.422581] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 691.422820] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 691.422994] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None
req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 692.386042] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Took 0.15 seconds to select destinations for 1 instance(s). {{(pid=52217) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 692.396067] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 692.396507] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 692.396507] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 692.438158] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 692.438158] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 692.438158] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 692.438296] nova-conductor[52217]: 
DEBUG oslo_concurrency.lockutils [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 692.438463] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 692.438624] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 692.450849] nova-conductor[52217]: DEBUG nova.quota [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Getting quotas for project 6feb8151289a4819ada69e3e87e3e27e. Resources: {'cores', 'instances', 'ram'} {{(pid=52217) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 692.454825] nova-conductor[52217]: DEBUG nova.quota [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Getting quotas for user fea9ad8b418d4797b0374a3b8da2cd95 and project 6feb8151289a4819ada69e3e87e3e27e. 
Resources: {'cores', 'instances', 'ram'} {{(pid=52217) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 692.459340] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52217) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 692.460609] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 692.460609] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 692.460609] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 692.466425] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] block_device_mapping [BlockDeviceMapping(attachment_id=d5344833-600a-46fe-8eaf-0b2c1cb157a1,boot_index=0,connection_info=None,created_at=<?>,delete_on_termination=True,deleted=<?>,deleted_at=<?>,destination_type='volume',device_name=None,device_type=None,disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=<?>,image_id=None,instance=<?>,instance_uuid=<?>,no_device=False,snapshot_id=None,source_type='volume',tag=None,updated_at=<?>,uuid=<?>,volume_id='070f46bc-c5c6-4781-b1c7-8b7202fb2254',volume_size=1,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 692.467139] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 692.467351] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900
tempest-ServersTestBootFromVolume-1733111900-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 692.467678] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 692.487875] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 692.487875] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 692.488038] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 693.245666] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Took 0.16 seconds to select destinations for 1 instance(s). 
{{(pid=52217) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 693.259900] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 693.260741] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 693.260830] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 693.298710] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 693.298917] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 693.299707] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 693.299707] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 693.299707] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 693.300090] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 693.317537] nova-conductor[52217]: DEBUG nova.quota [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Getting quotas for project ed29a28397c34c4491790f867effce4e. Resources: {'cores', 'instances', 'ram'} {{(pid=52217) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 693.318746] nova-conductor[52217]: DEBUG nova.quota [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Getting quotas for user 63112604b7d94a4aa441652bfcb9aed6 and project ed29a28397c34c4491790f867effce4e. Resources: {'cores', 'instances', 'ram'} {{(pid=52217) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 693.327558] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52217) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 693.328080] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 693.328286] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 693.328537] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 693.331606] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 693.332276] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 693.332474] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 693.332640] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 693.349195] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 693.349378] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 693.351055] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 694.674723] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Took 0.15 
seconds to select destinations for 1 instance(s). {{(pid=52217) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 694.689848] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 694.690085] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 694.690255] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 694.719364] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 694.719841] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 694.719841] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 694.720069] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 694.720259] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 
tempest-ListServerFiltersTestJSON-1569189728-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 694.720413] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 694.728941] nova-conductor[52217]: DEBUG nova.quota [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Getting quotas for project 61af4231bb9c4fc2a2d742b7c3d1db40. Resources: {'cores', 'instances', 'ram'} {{(pid=52217) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 694.731258] nova-conductor[52217]: DEBUG nova.quota [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Getting quotas for user db368a78ac5245d6a869b37de0fc1d2e and project 61af4231bb9c4fc2a2d742b7c3d1db40. Resources: {'cores', 'instances', 'ram'} {{(pid=52217) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 694.737238] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52217) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 694.737732] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 694.738056] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 694.738219] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 694.742965] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 
tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 694.743774] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 694.743806] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 694.743954] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 694.756507] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 694.756715] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 694.756884] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 695.532660] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Took 0.13 seconds to select destinations for 1 instance(s). {{(pid=52216) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 695.544687] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 695.545734] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 695.545734] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 695.572824] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 695.572824] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 695.573075] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 695.573331] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" 
{{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 695.573503] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 695.573802] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 695.583050] nova-conductor[52216]: DEBUG nova.quota [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Getting quotas for project 9d1b4970d4cd420e8fa3c599f48248e3. Resources: {'cores', 'instances', 'ram'} {{(pid=52216) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 695.585376] nova-conductor[52216]: DEBUG nova.quota [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Getting quotas for user 1c7cb7edb3ad470ab8eb7b47954960f9 and project 9d1b4970d4cd420e8fa3c599f48248e3. Resources: {'cores', 'instances', 'ram'} {{(pid=52216) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 695.590849] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52216) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 695.591328] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 695.591520] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 695.591677] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s 
{{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 695.594603] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] block_device_mapping [BlockDeviceMapping(attachment_id=1fe774a7-ab8c-4f22-8f2d-8f943554a512,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='volume',device_name=None,device_type=None,disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id=None,instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='volume',tag=None,updated_at=,uuid=,volume_id='fc3e203c-1f3f-4ffd-8eb7-15cdd302b995',volume_size=1,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 695.595256] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 695.596043] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 695.596043] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 695.608150] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 695.608376] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 695.608547] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Lock 
"ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 696.011892] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Took 0.14 seconds to select destinations for 1 instance(s). {{(pid=52217) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 696.025968] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 696.026243] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 696.026422] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 696.078026] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 696.078026] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 696.078026] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 696.078026] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Acquiring lock 
"ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 696.078026] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 696.078026] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 696.091598] nova-conductor[52217]: DEBUG nova.quota [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Getting quotas for project 29c14b15b48749a9bf540ab72ee49451. Resources: {'cores', 'instances', 'ram'} {{(pid=52217) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 696.094757] nova-conductor[52217]: DEBUG nova.quota [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Getting quotas for user 8af6208ef2254718bec125cc1bc85039 and project 29c14b15b48749a9bf540ab72ee49451. 
Resources: {'cores', 'instances', 'ram'} {{(pid=52217) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 696.103849] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52217) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 696.103849] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 696.103849] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 696.104263] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 696.107177] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 696.107834] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 696.108051] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 696.108222] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 696.126602] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 696.127082] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 696.127082] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 698.296948] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Took 0.15 seconds to select destinations for 1 instance(s). 
{{(pid=52216) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 698.308042] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 698.308637] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 698.308937] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 698.341547] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 698.341739] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 698.341907] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 698.342261] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 698.342440] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Lock 
"ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 698.342598] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 698.350988] nova-conductor[52216]: DEBUG nova.quota [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Getting quotas for project 035fb02e7d5e4870a9853822e21bff7b. Resources: {'cores', 'instances', 'ram'} {{(pid=52216) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 698.353254] nova-conductor[52216]: DEBUG nova.quota [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Getting quotas for user 7b24e88854ce4efb81412f81ab12f923 and project 035fb02e7d5e4870a9853822e21bff7b. Resources: {'cores', 'instances', 'ram'} {{(pid=52216) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 698.363233] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52216) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 698.363699] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 698.363899] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 698.364074] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 698.368612] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 
tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 698.369258] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 698.369457] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 698.369624] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 698.388019] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 698.388019] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 698.388019] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 698.715825] 
nova-conductor[52216]: ERROR nova.scheduler.utils [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port ad221b8a-6737-4342-b2e8-d01b33365352, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n 
self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 763e888f-2290-4ca2-ab8e-703450a21e35 was re-scheduled: Binding failed for port ad221b8a-6737-4342-b2e8-d01b33365352, please check neutron logs for more information.\n'] [ 698.716828] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Rescheduling: True {{(pid=52216) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 698.716828] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 763e888f-2290-4ca2-ab8e-703450a21e35.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 763e888f-2290-4ca2-ab8e-703450a21e35. [ 698.717012] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 763e888f-2290-4ca2-ab8e-703450a21e35. [ 698.739146] nova-conductor[52216]: DEBUG nova.network.neutron [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] deallocate_for_instance() {{(pid=52216) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 698.807883] nova-conductor[52216]: DEBUG nova.network.neutron [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] Instance cache missing network info. {{(pid=52216) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 698.810774] nova-conductor[52216]: DEBUG nova.network.neutron [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] Updating instance_info_cache with network_info: [] {{(pid=52216) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 699.085020] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Took 0.16 seconds to select destinations for 1 instance(s). 
{{(pid=52217) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 699.096171] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 699.096622] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 699.096932] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 699.130223] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 699.131013] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 699.133017] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 699.133017] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 699.133017] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None 
req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 699.133017] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 699.143611] nova-conductor[52217]: DEBUG nova.quota [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Getting quotas for project 734b5ead550e4b53b20700d2f870a662. Resources: {'cores', 'instances', 'ram'} {{(pid=52217) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 699.146263] nova-conductor[52217]: DEBUG nova.quota [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Getting quotas for user 47a15abf5130436b9bb35961355b2452 and project 734b5ead550e4b53b20700d2f870a662. Resources: {'cores', 'instances', 'ram'} {{(pid=52217) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 699.155025] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52217) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 699.155669] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 699.156056] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 699.156343] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 699.159695] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 699.160515] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 699.161878] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 699.161878] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 699.177957] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 699.178189] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 699.178352] 
nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 699.920133] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Took 0.15 seconds to select destinations for 1 instance(s). {{(pid=52216) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 699.932774] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 699.933050] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 699.933202] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 699.972655] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 699.972878] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 699.973064] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 699.973425] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 699.973608] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 699.973923] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 699.985344] nova-conductor[52216]: DEBUG nova.quota [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Getting quotas for project 296603b1512a48abb6f0d69cb85ac9da. Resources: {'cores', 'instances', 'ram'} {{(pid=52216) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 699.988833] nova-conductor[52216]: DEBUG nova.quota [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Getting quotas for user 6f9c6ae878974ba680acf8fae4e0f078 and project 296603b1512a48abb6f0d69cb85ac9da. 
Resources: {'cores', 'instances', 'ram'} {{(pid=52216) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 699.995513] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52216) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 699.996069] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 699.996307] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 699.996479] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 699.999517] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 699.999975] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 700.000188] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 700.000350] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 700.022592] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 700.022805] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 700.022978] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 701.121129] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Took 0.15 seconds to select destinations for 1 instance(s). 
{{(pid=52216) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 701.138549] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 701.138771] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 701.138946] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 701.178340] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 701.178567] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 701.178736] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 701.179139] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 701.179384] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 
tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 701.179603] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 701.189940] nova-conductor[52216]: DEBUG nova.quota [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Getting quotas for project 348e51e63ac440b1965e88712c05a3f1. Resources: {'cores', 'instances', 'ram'} {{(pid=52216) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 701.192257] nova-conductor[52216]: DEBUG nova.quota [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Getting quotas for user f1da4498087e480aa7d065697ca021a7 and project 348e51e63ac440b1965e88712c05a3f1. Resources: {'cores', 'instances', 'ram'} {{(pid=52216) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 701.198864] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52216) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 701.199191] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 701.199388] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 701.199553] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 701.206538] nova-conductor[52216]: DEBUG 
nova.conductor.manager [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 701.207378] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 701.207378] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 701.207547] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 701.238622] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 701.238845] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 701.239029] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 701.448393] nova-conductor[52216]: ERROR nova.scheduler.utils [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 4ccfce69-0ece-46b9-9cc5-a7451971a93d, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception 
occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0 was re-scheduled: Binding failed for port 4ccfce69-0ece-46b9-9cc5-a7451971a93d, please check neutron logs for more information.\n'] [ 701.448393] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Rescheduling: True {{(pid=52216) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 701.448393] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0. [ 701.452018] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0. [ 701.473466] nova-conductor[52216]: DEBUG nova.network.neutron [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] deallocate_for_instance() {{(pid=52216) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 701.562881] nova-conductor[52216]: DEBUG nova.network.neutron [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] Instance cache missing network info. {{(pid=52216) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 701.566091] nova-conductor[52216]: DEBUG nova.network.neutron [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] Updating instance_info_cache with network_info: [] {{(pid=52216) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 702.978373] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Took 0.13 seconds to select destinations for 1 instance(s). 
{{(pid=52216) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 702.996379] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 702.996610] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 702.996780] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 703.026212] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 703.026441] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 703.026709] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 703.026956] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 703.027155] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Lock 
"ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 703.027313] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 703.036247] nova-conductor[52216]: DEBUG nova.quota [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Getting quotas for project 61af4231bb9c4fc2a2d742b7c3d1db40. Resources: {'cores', 'instances', 'ram'} {{(pid=52216) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 703.038689] nova-conductor[52216]: DEBUG nova.quota [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Getting quotas for user db368a78ac5245d6a869b37de0fc1d2e and project 61af4231bb9c4fc2a2d742b7c3d1db40. Resources: {'cores', 'instances', 'ram'} {{(pid=52216) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 703.049817] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52216) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 703.049817] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 703.049995] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 703.050180] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 703.058071] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] 
[instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 703.058071] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 703.058071] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 703.058071] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 703.071082] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 703.071305] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 703.071471] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 703.554438] nova-conductor[52216]: ERROR nova.scheduler.utils [None 
req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9807a449-4cca-416c-815d-99d5bc674464] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 36365480-2ece-4649-a6c2-b796389d5a15, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File 
"/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 9807a449-4cca-416c-815d-99d5bc674464 was re-scheduled: Binding failed for port 36365480-2ece-4649-a6c2-b796389d5a15, please check neutron logs for more information.\n'] [ 703.555797] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Rescheduling: True {{(pid=52216) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 703.555797] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 9807a449-4cca-416c-815d-99d5bc674464.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 9807a449-4cca-416c-815d-99d5bc674464. [ 703.556202] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9807a449-4cca-416c-815d-99d5bc674464] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 9807a449-4cca-416c-815d-99d5bc674464. [ 703.607362] nova-conductor[52216]: DEBUG nova.network.neutron [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9807a449-4cca-416c-815d-99d5bc674464] deallocate_for_instance() {{(pid=52216) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 703.667026] nova-conductor[52217]: ERROR nova.scheduler.utils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n 
return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 8fceeed4-872d-49b7-ad0b-e79723805903, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 8aec9e05-7685-4895-b375-6f5cd45e7a5f was re-scheduled: Binding failed for port 8fceeed4-872d-49b7-ad0b-e79723805903, please check neutron logs for more information.\n'] [ 703.667026] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Rescheduling: True {{(pid=52217) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 703.667026] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 8aec9e05-7685-4895-b375-6f5cd45e7a5f.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 8aec9e05-7685-4895-b375-6f5cd45e7a5f. 
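
Editor's note: the tracebacks in this section all record the same failure chain. Neutron reports a failed port binding, Nova's _ensure_no_port_binding_failure (nova/network/neutron.py:294 in the traceback) raises PortBindingFailed, the compute manager converts that into RescheduledException, and because the scheduler selected no alternate hosts ("Alternates: []") the conductor immediately hits MaxRetriesExceeded and sets the instance to ERROR. The following minimal Python sketch is an assumption-level reconstruction for illustration only, not Nova's actual source: the simplified exception classes, the build_with_retries helper, and the 'binding_failed' vif_type check are assumptions inferred from the log text above.

    # Hedged sketch of the failure chain seen in the tracebacks above.
    class PortBindingFailed(Exception):
        def __init__(self, port_id):
            super().__init__(
                f"Binding failed for port {port_id}, "
                "please check neutron logs for more information.")

    class MaxRetriesExceeded(Exception):
        pass

    def ensure_no_port_binding_failure(port):
        # Assumption: mirrors the guard at nova/network/neutron.py:294,
        # which rejects ports whose binding the backend marked as failed.
        if port.get('binding:vif_type') == 'binding_failed':
            raise PortBindingFailed(port_id=port['id'])

    def build_with_retries(port, alternate_hosts):
        # Illustrative only: retry the build on each alternate host;
        # with an empty alternates list (as logged above) this raises
        # MaxRetriesExceeded on the first failure.
        for host in alternate_hosts:
            try:
                ensure_no_port_binding_failure(port)
                return host
            except PortBindingFailed:
                continue
        raise MaxRetriesExceeded(
            "Exceeded maximum number of retries. Exhausted all hosts "
            "available for retrying build failures.")

    # Example matching the logged behaviour: no alternates were selected,
    # so the conductor gives up straight away and marks the instance ERROR.
    failed_port = {'id': '8fceeed4-872d-49b7-ad0b-e79723805903',
                   'binding:vif_type': 'binding_failed'}
    try:
        build_with_retries(failed_port, alternate_hosts=[])
    except MaxRetriesExceeded as exc:
        print(f"Setting instance to ERROR state.: {exc}")

The remainder of the log resumes below with the corresponding warning and cleanup (deallocate_for_instance) entries.
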
[ 703.667026] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 8aec9e05-7685-4895-b375-6f5cd45e7a5f. [ 703.692064] nova-conductor[52217]: DEBUG nova.network.neutron [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] deallocate_for_instance() {{(pid=52217) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 703.823760] nova-conductor[52216]: DEBUG nova.network.neutron [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9807a449-4cca-416c-815d-99d5bc674464] Instance cache missing network info. {{(pid=52216) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 703.827723] nova-conductor[52216]: DEBUG nova.network.neutron [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9807a449-4cca-416c-815d-99d5bc674464] Updating instance_info_cache with network_info: [] {{(pid=52216) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 703.838108] nova-conductor[52217]: DEBUG nova.network.neutron [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] Instance cache missing network info. 
{{(pid=52217) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 703.840574] nova-conductor[52217]: DEBUG nova.network.neutron [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] Updating instance_info_cache with network_info: [] {{(pid=52217) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 704.332293] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 704.332293] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 704.332293] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 704.522749] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Took 0.15 seconds to select destinations for 1 instance(s). 
{{(pid=52216) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 704.547589] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 704.547867] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 704.548921] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 704.578363] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 704.578633] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 704.578779] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 704.579136] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 704.579316] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 704.579474] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 704.588336] nova-conductor[52216]: DEBUG nova.quota [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Getting quotas for project c2fd30bdaf134bbcb4e8e9840b095a9e. Resources: {'cores', 'instances', 'ram'} {{(pid=52216) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 704.590701] nova-conductor[52216]: DEBUG nova.quota [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Getting quotas for user 5858f0bb101c4a7dbd17a249240141be and project c2fd30bdaf134bbcb4e8e9840b095a9e. Resources: {'cores', 'instances', 'ram'} {{(pid=52216) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 704.599538] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52216) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 704.599538] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 704.599538] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 704.599538] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 704.601946] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 704.602660] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 704.602857] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 704.603034] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 704.619420] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 704.619614] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 704.619757] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 705.759545] nova-conductor[52217]: ERROR nova.scheduler.utils [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] [instance: 
1015f7da-bc69-489b-bb38-b31c1fe919a8] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 1f720111-e6c1-4177-a050-ae028aa2b25a, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 
1015f7da-bc69-489b-bb38-b31c1fe919a8 was re-scheduled: Binding failed for port 1f720111-e6c1-4177-a050-ae028aa2b25a, please check neutron logs for more information.\n'] [ 705.760150] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Rescheduling: True {{(pid=52217) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 705.760378] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 1015f7da-bc69-489b-bb38-b31c1fe919a8.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 1015f7da-bc69-489b-bb38-b31c1fe919a8. [ 705.760588] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 1015f7da-bc69-489b-bb38-b31c1fe919a8. [ 705.784156] nova-conductor[52217]: DEBUG nova.network.neutron [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] deallocate_for_instance() {{(pid=52217) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 705.989226] nova-conductor[52217]: DEBUG nova.network.neutron [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] Instance cache missing network info. 
{{(pid=52217) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 705.995668] nova-conductor[52217]: DEBUG nova.network.neutron [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] Updating instance_info_cache with network_info: [] {{(pid=52217) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 706.243054] nova-conductor[52216]: ERROR nova.scheduler.utils [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 
294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port c6a56c1d-139f-43f1-9a39-a337feab33b5, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 3157e7e4-fe8e-42b6-891b-ae0333b25f33 was re-scheduled: Binding failed for port c6a56c1d-139f-43f1-9a39-a337feab33b5, please check neutron logs for more information.\n'] [ 706.243054] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Rescheduling: True {{(pid=52216) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 706.243054] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 3157e7e4-fe8e-42b6-891b-ae0333b25f33.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 3157e7e4-fe8e-42b6-891b-ae0333b25f33. [ 706.243054] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 3157e7e4-fe8e-42b6-891b-ae0333b25f33. [ 706.295444] nova-conductor[52216]: DEBUG nova.network.neutron [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] deallocate_for_instance() {{(pid=52216) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 706.377544] nova-conductor[52216]: DEBUG nova.network.neutron [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Instance cache missing network info. 
{{(pid=52216) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 706.381312] nova-conductor[52216]: DEBUG nova.network.neutron [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Updating instance_info_cache with network_info: [] {{(pid=52216) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 707.897653] nova-conductor[52216]: ERROR nova.scheduler.utils [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File 
"/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 2b350739-2c35-49e6-891a-61da85a49d31, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance d65656c5-2cdb-4152-8e47-20d182d39c7a was re-scheduled: Binding failed for port 2b350739-2c35-49e6-891a-61da85a49d31, please check neutron logs for more information.\n'] [ 707.898775] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Rescheduling: True {{(pid=52216) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 707.899133] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance d65656c5-2cdb-4152-8e47-20d182d39c7a.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance d65656c5-2cdb-4152-8e47-20d182d39c7a. [ 707.900047] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance d65656c5-2cdb-4152-8e47-20d182d39c7a. [ 707.925032] nova-conductor[52216]: DEBUG nova.network.neutron [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] deallocate_for_instance() {{(pid=52216) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 707.976972] nova-conductor[52216]: DEBUG nova.network.neutron [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] Instance cache missing network info. 
{{(pid=52216) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 707.981578] nova-conductor[52216]: DEBUG nova.network.neutron [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] Updating instance_info_cache with network_info: [] {{(pid=52216) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 708.978547] nova-conductor[52216]: ERROR nova.scheduler.utils [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 
294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 0eb03d48-4be8-4f39-82d5-41e84fcb3939, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 630acd3e-e4e3-483b-984c-7023fd8c77d5 was re-scheduled: Binding failed for port 0eb03d48-4be8-4f39-82d5-41e84fcb3939, please check neutron logs for more information.\n'] [ 708.979177] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Rescheduling: True {{(pid=52216) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 708.979393] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 630acd3e-e4e3-483b-984c-7023fd8c77d5.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 630acd3e-e4e3-483b-984c-7023fd8c77d5. [ 708.979607] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 630acd3e-e4e3-483b-984c-7023fd8c77d5. [ 709.005028] nova-conductor[52216]: DEBUG nova.network.neutron [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] deallocate_for_instance() {{(pid=52216) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 709.022494] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Took 0.15 seconds to select destinations for 1 instance(s). 
{{(pid=52217) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 709.049275] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 709.049510] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 709.049676] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 709.069900] nova-conductor[52216]: DEBUG nova.network.neutron [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] Instance cache missing network info. 
{{(pid=52216) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 709.076418] nova-conductor[52216]: DEBUG nova.network.neutron [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] Updating instance_info_cache with network_info: [] {{(pid=52216) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 709.095311] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 709.095531] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 709.095693] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 709.096053] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 709.096225] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 709.096519] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 709.106419] nova-conductor[52217]: DEBUG nova.quota [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Getting quotas for project 
15e5a73b9fc04caaa60d4dcc2f8b0380. Resources: {'cores', 'instances', 'ram'} {{(pid=52217) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 709.109137] nova-conductor[52217]: DEBUG nova.quota [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Getting quotas for user 65bfc207577b49578dd11f62fa61e23e and project 15e5a73b9fc04caaa60d4dcc2f8b0380. Resources: {'cores', 'instances', 'ram'} {{(pid=52217) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 709.131500] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52217) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 709.131900] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 709.132426] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 709.132507] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 709.137073] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 709.137713] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 
tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 709.138033] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 709.138092] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 709.164998] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 709.165239] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 709.165404] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 709.208382] nova-conductor[52216]: ERROR nova.scheduler.utils [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = 
vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port efe32174-7153-4005-a87b-4cc066a6d6b7, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 06b62938-99d6-43a1-af87-aced894bc8d8 was re-scheduled: Binding failed for port efe32174-7153-4005-a87b-4cc066a6d6b7, please check neutron logs for more information.\n'] [ 709.208858] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Rescheduling: True {{(pid=52216) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 709.209099] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 
tempest-ServerActionsV293TestJSON-186183883-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 06b62938-99d6-43a1-af87-aced894bc8d8.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 06b62938-99d6-43a1-af87-aced894bc8d8. [ 709.210410] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 06b62938-99d6-43a1-af87-aced894bc8d8. [ 709.234775] nova-conductor[52216]: DEBUG nova.network.neutron [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] deallocate_for_instance() {{(pid=52216) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 709.486204] nova-conductor[52216]: DEBUG nova.network.neutron [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Instance cache missing network info. {{(pid=52216) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 709.492874] nova-conductor[52216]: DEBUG nova.network.neutron [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Updating instance_info_cache with network_info: [] {{(pid=52216) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 710.071478] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Took 0.13 seconds to select destinations for 1 instance(s). 
{{(pid=52216) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 710.083704] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 710.086532] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 710.086532] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 710.114734] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 710.115292] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 710.115738] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 710.116385] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 710.117049] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 710.117434] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 710.127957] nova-conductor[52216]: DEBUG nova.quota [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Getting quotas for project 57e287820a334ab58ec0c42d68339b66. Resources: {'cores', 'instances', 'ram'} {{(pid=52216) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 710.131555] nova-conductor[52216]: DEBUG nova.quota [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Getting quotas for user 49ac2f7c40cb472d9b9678a537ea011f and project 57e287820a334ab58ec0c42d68339b66. Resources: {'cores', 'instances', 'ram'} {{(pid=52216) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 710.138447] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52216) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 710.138646] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 710.138843] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 710.139023] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 710.142091] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 710.142840] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 710.143049] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 710.143220] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 710.157417] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 710.157628] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 710.157804] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 711.925951] nova-conductor[52216]: ERROR nova.scheduler.utils [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 
tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 89928e6b-9db2-4a6a-8579-49fb7f3d9e4f, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise 
exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 3f53c35f-40ea-4094-89e2-624b156e5560 was re-scheduled: Binding failed for port 89928e6b-9db2-4a6a-8579-49fb7f3d9e4f, please check neutron logs for more information.\n'] [ 711.926579] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Rescheduling: True {{(pid=52216) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 711.926891] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 3f53c35f-40ea-4094-89e2-624b156e5560.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 3f53c35f-40ea-4094-89e2-624b156e5560. [ 711.927132] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 3f53c35f-40ea-4094-89e2-624b156e5560. [ 711.951434] nova-conductor[52216]: DEBUG nova.network.neutron [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] deallocate_for_instance() {{(pid=52216) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 711.989401] nova-conductor[52216]: DEBUG nova.network.neutron [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] Instance cache missing network info. 
{{(pid=52216) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 711.994474] nova-conductor[52216]: DEBUG nova.network.neutron [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] Updating instance_info_cache with network_info: [] {{(pid=52216) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 712.298535] nova-conductor[52217]: ERROR nova.scheduler.utils [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File 
"/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port bc0d8fad-86d1-4cf5-9df4-6db7cc8f5e4d, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 588eb672-6240-46cd-8e93-b38c9e2829bf was re-scheduled: Binding failed for port bc0d8fad-86d1-4cf5-9df4-6db7cc8f5e4d, please check neutron logs for more information.\n'] [ 712.300036] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Rescheduling: True {{(pid=52217) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 712.300036] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 588eb672-6240-46cd-8e93-b38c9e2829bf.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 588eb672-6240-46cd-8e93-b38c9e2829bf. [ 712.300036] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 588eb672-6240-46cd-8e93-b38c9e2829bf. [ 712.321234] nova-conductor[52217]: DEBUG nova.network.neutron [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] deallocate_for_instance() {{(pid=52217) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 712.357970] nova-conductor[52217]: DEBUG nova.network.neutron [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] Instance cache missing network info. 
{{(pid=52217) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 712.362786] nova-conductor[52217]: DEBUG nova.network.neutron [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] Updating instance_info_cache with network_info: [] {{(pid=52217) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 713.105126] nova-conductor[52217]: ERROR nova.scheduler.utils [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File 
"/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port ee509bee-9216-4859-ac9b-0313db219df0, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance e31d29d1-c49c-4696-85c9-11cb985a7bfd was re-scheduled: Binding failed for port ee509bee-9216-4859-ac9b-0313db219df0, please check neutron logs for more information.\n'] [ 713.105126] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Rescheduling: True {{(pid=52217) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 713.105126] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance e31d29d1-c49c-4696-85c9-11cb985a7bfd.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance e31d29d1-c49c-4696-85c9-11cb985a7bfd. [ 713.105420] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance e31d29d1-c49c-4696-85c9-11cb985a7bfd. [ 713.129560] nova-conductor[52217]: DEBUG nova.network.neutron [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] deallocate_for_instance() {{(pid=52217) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 713.188879] nova-conductor[52217]: DEBUG nova.network.neutron [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] Instance cache missing network info. 
{{(pid=52217) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 713.192241] nova-conductor[52217]: DEBUG nova.network.neutron [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] Updating instance_info_cache with network_info: [] {{(pid=52217) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 713.541827] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Took 0.15 seconds to select destinations for 1 instance(s). {{(pid=52217) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 713.560575] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 713.560848] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 713.561094] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 713.598271] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 713.598551] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 713.598788] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 713.599181] nova-conductor[52217]: DEBUG 
oslo_concurrency.lockutils [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 713.599394] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 713.599599] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 713.611637] nova-conductor[52217]: DEBUG nova.quota [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Getting quotas for project bd8ec082967e4839810b492d2af942a6. Resources: {'cores', 'instances', 'ram'} {{(pid=52217) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 713.615028] nova-conductor[52217]: DEBUG nova.quota [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Getting quotas for user f40b2254538745bf9406408a158a55fd and project bd8ec082967e4839810b492d2af942a6. 
Resources: {'cores', 'instances', 'ram'} {{(pid=52217) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 713.622347] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52217) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 713.622866] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 713.623414] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 713.623706] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 713.629698] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 713.629698] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 713.629698] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 713.629698] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 713.646224] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 713.646440] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 713.646609] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 714.041283] nova-conductor[52217]: ERROR nova.scheduler.utils [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return 
self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 2112df5f-4ebb-4b2d-b9eb-3bca9d4e9655, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 16b35372-2e84-4f6c-ab01-fcbc86e9cca0 was re-scheduled: Binding failed for port 2112df5f-4ebb-4b2d-b9eb-3bca9d4e9655, please check neutron logs for more information.\n'] [ 714.041533] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Rescheduling: True {{(pid=52217) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 714.041712] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 16b35372-2e84-4f6c-ab01-fcbc86e9cca0.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 16b35372-2e84-4f6c-ab01-fcbc86e9cca0. [ 714.044083] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. 
Exhausted all hosts available for retrying build failures for instance 16b35372-2e84-4f6c-ab01-fcbc86e9cca0. [ 714.081824] nova-conductor[52217]: DEBUG nova.network.neutron [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] deallocate_for_instance() {{(pid=52217) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 714.124729] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Took 0.15 seconds to select destinations for 1 instance(s). {{(pid=52216) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 714.146184] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 714.146590] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 714.146590] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 714.154504] nova-conductor[52217]: DEBUG nova.network.neutron [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] Instance cache missing network info. 
{{(pid=52217) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 714.159408] nova-conductor[52217]: DEBUG nova.network.neutron [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] Updating instance_info_cache with network_info: [] {{(pid=52217) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 714.190911] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 714.191037] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 714.191225] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 714.191560] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 714.191738] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 714.191897] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 714.201850] nova-conductor[52216]: DEBUG nova.quota [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Getting quotas for project cc4c8738af2b48f981e5f2feadb41a59. 
Resources: {'cores', 'instances', 'ram'} {{(pid=52216) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 714.202829] nova-conductor[52216]: DEBUG nova.quota [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Getting quotas for user 6316c8b7da8d4d3c97b2693b33729c52 and project cc4c8738af2b48f981e5f2feadb41a59. Resources: {'cores', 'instances', 'ram'} {{(pid=52216) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 714.217292] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52216) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 714.217671] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 714.217910] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 714.218146] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 714.224806] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 714.225441] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" 
{{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 714.225642] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 714.225809] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 714.245146] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 714.245146] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 714.245438] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 715.693997] nova-conductor[52216]: ERROR nova.scheduler.utils [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File 
"/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 913bdb45-f543-4a9e-8e8c-d569876ac3b1, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance e89d07fc-9c98-4352-b609-c7fde7ee0d39 was re-scheduled: Binding failed for port 913bdb45-f543-4a9e-8e8c-d569876ac3b1, please check neutron logs for more information.\n'] [ 715.694635] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Rescheduling: True {{(pid=52216) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 715.694866] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance e89d07fc-9c98-4352-b609-c7fde7ee0d39.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. 
Exhausted all hosts available for retrying build failures for instance e89d07fc-9c98-4352-b609-c7fde7ee0d39. [ 715.695197] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance e89d07fc-9c98-4352-b609-c7fde7ee0d39. [ 715.719009] nova-conductor[52216]: DEBUG nova.network.neutron [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] deallocate_for_instance() {{(pid=52216) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 715.786445] nova-conductor[52216]: DEBUG nova.network.neutron [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] Instance cache missing network info. {{(pid=52216) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 715.790433] nova-conductor[52216]: DEBUG nova.network.neutron [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] Updating instance_info_cache with network_info: [] {{(pid=52216) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 718.957497] nova-conductor[52216]: ERROR nova.scheduler.utils [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File 
"/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 50df6311-769b-4b6b-9bbd-76405db8df9b, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 6fdadbc2-14e5-440f-aba2-4db693f56de6 was re-scheduled: Binding failed for port 50df6311-769b-4b6b-9bbd-76405db8df9b, please check neutron logs for more information.\n'] [ 718.957497] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Rescheduling: True {{(pid=52216) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 718.958123] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 6fdadbc2-14e5-440f-aba2-4db693f56de6.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 6fdadbc2-14e5-440f-aba2-4db693f56de6. [ 718.958183] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. 
Exhausted all hosts available for retrying build failures for instance 6fdadbc2-14e5-440f-aba2-4db693f56de6. [ 718.981390] nova-conductor[52216]: DEBUG nova.network.neutron [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] deallocate_for_instance() {{(pid=52216) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 719.052018] nova-conductor[52216]: DEBUG nova.network.neutron [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] Instance cache missing network info. {{(pid=52216) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 719.057324] nova-conductor[52216]: DEBUG nova.network.neutron [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] Updating instance_info_cache with network_info: [] {{(pid=52216) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 720.144832] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Took 0.18 seconds to select destinations for 2 instance(s). {{(pid=52217) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 720.156557] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 720.156781] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 720.156951] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 720.186886] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 720.187134] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None 
req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 720.187308] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 720.218072] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 720.218313] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 720.218478] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 720.218997] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 720.218997] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 720.219191] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 720.228516] nova-conductor[52217]: DEBUG nova.quota [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Getting quotas for project 1de6a55b95aa4af2865ec70142a20326. Resources: {'cores', 'instances', 'ram'} {{(pid=52217) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 720.230939] nova-conductor[52217]: DEBUG nova.quota [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Getting quotas for user bfbe2267ded84c71b3af181cb852d581 and project 1de6a55b95aa4af2865ec70142a20326. Resources: {'cores', 'instances', 'ram'} {{(pid=52217) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 720.236582] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52217) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 720.236969] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 720.237196] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 720.237363] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 720.240616] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping 
/opt/stack/nova/nova/conductor/manager.py:1506}} [ 720.241278] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 720.241472] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 720.241631] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 720.255973] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 720.256202] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 720.256344] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 720.262914] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52217) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 720.263389] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 720.263593] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 720.263759] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 720.267147] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 720.267790] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 720.267988] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 720.268163] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 720.284582] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 
tempest-MultipleCreateTestJSON-1361786026-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 720.284888] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 720.284970] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 721.298048] nova-conductor[52217]: ERROR nova.scheduler.utils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = 
self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port f55d6baf-cc51-4f55-88ec-78d6f8b9c411, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 73b2fd88-ded1-4a92-a973-6a49e57faa5e was re-scheduled: Binding failed for port f55d6baf-cc51-4f55-88ec-78d6f8b9c411, please check neutron logs for more information.\n'] [ 721.298697] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Rescheduling: True {{(pid=52217) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 721.298926] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 73b2fd88-ded1-4a92-a973-6a49e57faa5e.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 73b2fd88-ded1-4a92-a973-6a49e57faa5e. [ 721.299258] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 73b2fd88-ded1-4a92-a973-6a49e57faa5e. 
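The entries above repeat the same chain for several instances: neutron port binding fails (nova.exception.PortBindingFailed), the VMware driver spawn aborts, the conductor reschedules, and because the scheduler returned no alternates ("Alternates: []") the retry budget is immediately exhausted (MaxRetriesExceeded) and the instance is set to ERROR. As a purely editorial aid for triaging a log like this one, the following minimal sketch (an assumption, not part of nova; names such as summarize and the regexes are hypothetical and derived only from the message formats visible above) pairs each rescheduled instance UUID with the port whose binding failed and notes whether it ended up in ERROR.

# Hedged sketch: parse nova-conductor log text for the PortBindingFailed /
# MaxRetriesExceeded pattern shown in the surrounding entries.
import re
import sys
from collections import OrderedDict

# Matches: "Build of instance <uuid> was re-scheduled: Binding failed for port <uuid>"
PORT_RE = re.compile(
    r"Build of instance (?P<instance>[0-9a-f-]{36}) was re-scheduled: "
    r"Binding failed for port (?P<port>[0-9a-f-]{36})")
# Matches: "[instance: <uuid>] Setting instance to ERROR state"
ERROR_RE = re.compile(
    r"\[instance: (?P<instance>[0-9a-f-]{36})\] Setting instance to ERROR state")

def summarize(log_text):
    """Map each rescheduled instance UUID to the failed port, flagging ERROR state."""
    ports = OrderedDict()
    errored = set()
    for m in PORT_RE.finditer(log_text):
        ports[m.group("instance")] = m.group("port")
    for m in ERROR_RE.finditer(log_text):
        errored.add(m.group("instance"))
    return [(inst, port, inst in errored) for inst, port in ports.items()]

if __name__ == "__main__":
    for inst, port, is_error in summarize(sys.stdin.read()):
        print(f"{inst}  port={port}  {'ERROR' if is_error else 'rescheduled'}")

Run against a captured log (file name hypothetical), e.g. "python3 triage.py < nova-conductor.log"; for the entries above it would report, for example, instance e89d07fc-9c98-4352-b609-c7fde7ee0d39 stuck on port 913bdb45-f543-4a9e-8e8c-d569876ac3b1 in ERROR. The root cause still has to be read out of the neutron logs, as each PortBindingFailed message says.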
[ 721.322101] nova-conductor[52217]: DEBUG nova.network.neutron [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] deallocate_for_instance() {{(pid=52217) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 721.566456] nova-conductor[52217]: DEBUG nova.network.neutron [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] Instance cache missing network info. {{(pid=52217) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 721.571374] nova-conductor[52217]: DEBUG nova.network.neutron [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] Updating instance_info_cache with network_info: [] {{(pid=52217) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 723.121425] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Took 0.16 seconds to select destinations for 1 instance(s). {{(pid=52216) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 723.137467] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 723.137467] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 723.137467] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 723.173273] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 723.173426] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 723.173624] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 723.174094] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 723.174162] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 723.174324] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 723.189235] nova-conductor[52216]: DEBUG nova.quota [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Getting quotas for project 703100dfccf3406aa1580f39e87cdbb1. Resources: {'cores', 'instances', 'ram'} {{(pid=52216) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 723.191482] nova-conductor[52216]: DEBUG nova.quota [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Getting quotas for user e41f52cd69cb4095bf97438cf32fc145 and project 703100dfccf3406aa1580f39e87cdbb1. 
Resources: {'cores', 'instances', 'ram'} {{(pid=52216) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 723.200919] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52216) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 723.201384] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 723.201602] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 723.201764] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 723.205946] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 723.206628] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 723.206828] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 723.206989] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 723.225945] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 723.226968] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 723.226968] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 725.609729] nova-conductor[52217]: ERROR nova.scheduler.utils [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in 
switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port a5869fae-2063-4a18-99cd-ceda0844e417, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c was re-scheduled: Binding failed for port a5869fae-2063-4a18-99cd-ceda0844e417, please check neutron logs for more information.\n'] [ 725.610429] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Rescheduling: True {{(pid=52217) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 725.610624] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c. [ 725.610827] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. 
Exhausted all hosts available for retrying build failures for instance 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c. [ 725.669774] nova-conductor[52217]: DEBUG nova.network.neutron [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] deallocate_for_instance() {{(pid=52217) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 725.700717] nova-conductor[52216]: ERROR nova.scheduler.utils [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in 
_ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 02d27e3c-0790-4e86-b3f1-9cf06e423758, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 2e990d70-8e51-4900-9d9d-db920311a8ab was re-scheduled: Binding failed for port 02d27e3c-0790-4e86-b3f1-9cf06e423758, please check neutron logs for more information.\n'] [ 725.701402] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Rescheduling: True {{(pid=52216) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 725.701635] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 2e990d70-8e51-4900-9d9d-db920311a8ab.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 2e990d70-8e51-4900-9d9d-db920311a8ab. [ 725.701938] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 2e990d70-8e51-4900-9d9d-db920311a8ab. [ 725.730738] nova-conductor[52216]: DEBUG nova.network.neutron [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] deallocate_for_instance() {{(pid=52216) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 725.738421] nova-conductor[52217]: DEBUG nova.network.neutron [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] Instance cache missing network info. 
{{(pid=52217) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 725.742211] nova-conductor[52217]: DEBUG nova.network.neutron [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] Updating instance_info_cache with network_info: [] {{(pid=52217) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 726.009928] nova-conductor[52216]: DEBUG nova.network.neutron [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] Instance cache missing network info. {{(pid=52216) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 726.016447] nova-conductor[52216]: DEBUG nova.network.neutron [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] Updating instance_info_cache with network_info: [] {{(pid=52216) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 726.398948] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Took 0.16 seconds to select destinations for 1 instance(s). {{(pid=52216) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 726.411624] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 726.411940] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 726.412126] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 726.443905] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 726.446025] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None 
req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 726.446025] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 726.446025] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 726.446025] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 726.446025] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 726.459882] nova-conductor[52216]: DEBUG nova.quota [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Getting quotas for project 69518eb542f84a3b9ce9cb5a4f266c9a. Resources: {'cores', 'instances', 'ram'} {{(pid=52216) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 726.463316] nova-conductor[52216]: DEBUG nova.quota [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Getting quotas for user 666cd8bcdfa84942bc4ea57bf9ad5a00 and project 69518eb542f84a3b9ce9cb5a4f266c9a. 
Resources: {'cores', 'instances', 'ram'} {{(pid=52216) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 726.470061] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52216) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 726.470393] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 726.470631] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 726.470826] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 726.476998] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 726.477719] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 726.477857] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" 
acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 726.478021] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 726.491757] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 726.492367] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 726.492367] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 731.226854] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Took 0.16 seconds to select destinations for 1 instance(s). 
{{(pid=52217) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 731.255069] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 731.255069] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 731.255069] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 731.288487] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 731.288708] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 731.288893] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 731.289267] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 731.289454] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 731.289621] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 731.298753] nova-conductor[52217]: DEBUG nova.quota [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Getting quotas for project bd8ec082967e4839810b492d2af942a6. Resources: {'cores', 'instances', 'ram'} {{(pid=52217) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 731.301341] nova-conductor[52217]: DEBUG nova.quota [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Getting quotas for user f40b2254538745bf9406408a158a55fd and project bd8ec082967e4839810b492d2af942a6. Resources: {'cores', 'instances', 'ram'} {{(pid=52217) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 731.308412] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52217) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 731.308874] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 731.309085] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 731.309250] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 731.312716] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 731.312900] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 731.313082] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 731.313243] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 731.330023] nova-conductor[52217]: ERROR nova.scheduler.utils [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File 
"/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 4ca49a09-f39b-4b5d-8ffe-428938f9486e, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 449cb7ea-c7e9-411c-9c09-f451d892d32c was re-scheduled: Binding failed for port 4ca49a09-f39b-4b5d-8ffe-428938f9486e, please check neutron logs for more information.\n'] [ 731.330296] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Rescheduling: True {{(pid=52217) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 731.330532] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 449cb7ea-c7e9-411c-9c09-f451d892d32c.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 449cb7ea-c7e9-411c-9c09-f451d892d32c. 
[ 731.330736] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 449cb7ea-c7e9-411c-9c09-f451d892d32c. [ 731.335591] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 731.335814] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 731.335955] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 731.367268] nova-conductor[52217]: DEBUG nova.network.neutron [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] deallocate_for_instance() {{(pid=52217) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 731.563054] nova-conductor[52217]: DEBUG nova.network.neutron [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Instance cache missing network info. {{(pid=52217) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 731.573228] nova-conductor[52217]: DEBUG nova.network.neutron [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Updating instance_info_cache with network_info: [] {{(pid=52217) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 734.625329] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Took 0.15 seconds to select destinations for 1 instance(s). 
{{(pid=52216) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 734.643290] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 734.643290] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 734.643290] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 734.673496] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 734.673710] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 734.673868] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 734.674228] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 734.674642] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Lock 
"ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 734.674642] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 734.688899] nova-conductor[52216]: DEBUG nova.quota [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Getting quotas for project 035fb02e7d5e4870a9853822e21bff7b. Resources: {'cores', 'instances', 'ram'} {{(pid=52216) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 734.697907] nova-conductor[52216]: DEBUG nova.quota [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Getting quotas for user 7b24e88854ce4efb81412f81ab12f923 and project 035fb02e7d5e4870a9853822e21bff7b. Resources: {'cores', 'instances', 'ram'} {{(pid=52216) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 734.707464] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 7b5558e4-05fc-4755-accf-77228272884f] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52216) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 734.707963] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 734.708183] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 734.708343] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 734.716110] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 
tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 7b5558e4-05fc-4755-accf-77228272884f] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 734.716339] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 734.716339] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 734.716493] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 734.737586] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 734.737586] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 734.737965] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 736.721999] 
nova-conductor[52217]: ERROR nova.scheduler.utils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port c0cf9857-5476-4c79-8c6b-6afade2558c1, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, 
image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 9999064a-7edc-4e2c-92fb-7a713194764c was re-scheduled: Binding failed for port c0cf9857-5476-4c79-8c6b-6afade2558c1, please check neutron logs for more information.\n'] [ 736.722655] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Rescheduling: True {{(pid=52217) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 736.722835] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 9999064a-7edc-4e2c-92fb-7a713194764c.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 9999064a-7edc-4e2c-92fb-7a713194764c. [ 736.723586] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 9999064a-7edc-4e2c-92fb-7a713194764c. [ 736.753713] nova-conductor[52217]: DEBUG nova.network.neutron [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] deallocate_for_instance() {{(pid=52217) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 736.758131] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Took 0.15 seconds to select destinations for 1 instance(s). 
{{(pid=52217) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 736.771789] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 736.771989] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 736.772166] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 736.810156] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 736.813247] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 736.813247] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 736.813247] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 736.813247] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 736.813247] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 736.823205] nova-conductor[52217]: DEBUG nova.quota [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Getting quotas for project 8863ad7f29dc48a59a77f57a53f759c6. Resources: {'cores', 'instances', 'ram'} {{(pid=52217) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 736.826576] nova-conductor[52217]: DEBUG nova.quota [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Getting quotas for user aae8e6cff59a4db1aca1c4adc48cb289 and project 8863ad7f29dc48a59a77f57a53f759c6. Resources: {'cores', 'instances', 'ram'} {{(pid=52217) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 736.836355] nova-conductor[52217]: DEBUG nova.network.neutron [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] Instance cache missing network info. {{(pid=52217) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 736.837231] nova-conductor[52217]: DEBUG nova.network.neutron [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] Updating instance_info_cache with network_info: [] {{(pid=52217) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 736.838508] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52217) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 736.838946] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 736.839160] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 736.839320] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 736.843221] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 736.843877] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 736.844098] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 736.845398] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 736.872969] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 736.872969] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 736.873103] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 737.974454] nova-conductor[52217]: ERROR nova.scheduler.utils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File 
"/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 48c06748-2a4a-4d66-bca6-e8dde7b09b0f, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance a1f06df7-a38c-431d-98ee-7b3df8224ea1 was re-scheduled: Binding failed for port 48c06748-2a4a-4d66-bca6-e8dde7b09b0f, please check neutron logs for more information.\n'] [ 737.975089] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Rescheduling: True {{(pid=52217) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 737.975323] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance a1f06df7-a38c-431d-98ee-7b3df8224ea1.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance a1f06df7-a38c-431d-98ee-7b3df8224ea1. [ 737.975622] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance a1f06df7-a38c-431d-98ee-7b3df8224ea1. [ 738.003592] nova-conductor[52217]: DEBUG nova.network.neutron [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] deallocate_for_instance() {{(pid=52217) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 738.050876] nova-conductor[52217]: DEBUG nova.network.neutron [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] Instance cache missing network info. 
{{(pid=52217) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 738.058339] nova-conductor[52217]: DEBUG nova.network.neutron [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] Updating instance_info_cache with network_info: [] {{(pid=52217) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 738.221784] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Took 0.14 seconds to select destinations for 1 instance(s). {{(pid=52216) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 738.241539] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 738.241797] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 738.241967] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 738.282839] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 738.283069] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 738.283233] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 738.283569] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 738.283741] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 738.283889] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 738.293796] nova-conductor[52216]: DEBUG nova.quota [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Getting quotas for project 51ad16571493443e908ec396deddcdfb. Resources: {'cores', 'instances', 'ram'} {{(pid=52216) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 738.296564] nova-conductor[52216]: DEBUG nova.quota [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Getting quotas for user b8c7ae6f85f24cc38bd0dfb4e56f713c and project 51ad16571493443e908ec396deddcdfb. 
Resources: {'cores', 'instances', 'ram'} {{(pid=52216) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 738.302295] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52216) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 738.302883] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 738.303145] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 738.303287] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 738.307020] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 738.307020] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 738.307324] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" 
acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 738.307324] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 738.322107] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 738.322107] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 738.322107] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 739.583365] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Took 0.15 seconds to select destinations for 1 instance(s). 
{{(pid=52216) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 739.601939] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 739.602933] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 739.603280] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 739.646072] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 739.646316] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 739.646494] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 739.646841] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 739.647306] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 739.647306] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 739.671960] nova-conductor[52216]: DEBUG nova.quota [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Getting quotas for project cc4c8738af2b48f981e5f2feadb41a59. Resources: {'cores', 'instances', 'ram'} {{(pid=52216) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 739.674704] nova-conductor[52216]: DEBUG nova.quota [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Getting quotas for user 6316c8b7da8d4d3c97b2693b33729c52 and project cc4c8738af2b48f981e5f2feadb41a59. Resources: {'cores', 'instances', 'ram'} {{(pid=52216) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 739.681683] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52216) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 739.682195] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 739.682402] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 739.682568] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 739.686262] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 739.687270] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 739.687490] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 739.687652] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 739.704404] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 739.704608] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 739.704774] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 740.503708] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Took 0.14 
seconds to select destinations for 1 instance(s). {{(pid=52216) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 740.528759] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 740.529048] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.003s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 740.529580] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 740.591349] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 740.592239] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.002s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 740.592586] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 740.593463] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 740.597334] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Lock 
"ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 740.597334] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.002s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 740.616201] nova-conductor[52216]: DEBUG nova.quota [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Getting quotas for project 03e26d763fe749bebf9aab124185bffe. Resources: {'cores', 'instances', 'ram'} {{(pid=52216) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 740.617903] nova-conductor[52216]: DEBUG nova.quota [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Getting quotas for user 2e6de82121c64c36a180db6eb2734ee2 and project 03e26d763fe749bebf9aab124185bffe. Resources: {'cores', 'instances', 'ram'} {{(pid=52216) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 740.635107] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] [instance: 671639d6-3103-4eeb-86d3-b858a3919396] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52216) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 740.635107] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 740.635107] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 740.635229] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 740.642038] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] [instance: 
671639d6-3103-4eeb-86d3-b858a3919396] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 740.642768] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 740.642768] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 740.642880] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 740.662364] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 740.662579] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 740.665607] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 740.752911] nova-conductor[52217]: ERROR nova.scheduler.utils [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 
tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 17131708-5879-4378-bd78-cf76a598e578, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise 
exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 904eb823-1bb4-48b1-8460-c722cbc4652c was re-scheduled: Binding failed for port 17131708-5879-4378-bd78-cf76a598e578, please check neutron logs for more information.\n'] [ 740.753590] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Rescheduling: True {{(pid=52217) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 740.753823] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 904eb823-1bb4-48b1-8460-c722cbc4652c.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 904eb823-1bb4-48b1-8460-c722cbc4652c. [ 740.754061] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 904eb823-1bb4-48b1-8460-c722cbc4652c. [ 740.799173] nova-conductor[52217]: DEBUG nova.network.neutron [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] deallocate_for_instance() {{(pid=52217) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 741.034975] nova-conductor[52217]: DEBUG nova.network.neutron [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] Instance cache missing network info. 
{{(pid=52217) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 741.043714] nova-conductor[52217]: DEBUG nova.network.neutron [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] Updating instance_info_cache with network_info: [] {{(pid=52217) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 741.398592] nova-conductor[52216]: ERROR nova.scheduler.utils [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in 
_ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port d7751aee-44fe-4f97-94a8-f9c9b96e7bfe, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77 was re-scheduled: Binding failed for port d7751aee-44fe-4f97-94a8-f9c9b96e7bfe, please check neutron logs for more information.\n'] [ 741.399234] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Rescheduling: True {{(pid=52216) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 741.399459] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77. [ 741.399764] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77. [ 741.429821] nova-conductor[52216]: DEBUG nova.network.neutron [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] deallocate_for_instance() {{(pid=52216) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 741.472759] nova-conductor[52216]: DEBUG nova.network.neutron [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Instance cache missing network info. 
{{(pid=52216) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 741.476070] nova-conductor[52216]: DEBUG nova.network.neutron [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Updating instance_info_cache with network_info: [] {{(pid=52216) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 743.448836] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Took 0.28 seconds to select destinations for 3 instance(s). {{(pid=52217) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 743.461829] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 743.462073] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 743.462240] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 743.492982] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 743.492982] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 743.492982] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s 
{{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 743.537743] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 743.537743] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 743.537743] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 743.569189] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 743.569453] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 743.569669] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 743.570103] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 743.570301] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 
tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 743.570521] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 743.581551] nova-conductor[52217]: DEBUG nova.quota [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Getting quotas for project c4efa8e2dcbc492ea32aa20745286b46. Resources: {'cores', 'instances', 'ram'} {{(pid=52217) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 743.587790] nova-conductor[52217]: DEBUG nova.quota [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Getting quotas for user 6e7b7c11c6f14d82aae4d866a76aaa73 and project c4efa8e2dcbc492ea32aa20745286b46. Resources: {'cores', 'instances', 'ram'} {{(pid=52217) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 743.591241] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52217) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 743.592050] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 743.592310] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 743.592516] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 743.595307] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c 
tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 743.596657] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 743.596657] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 743.596657] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 743.608919] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 743.609178] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 743.609347] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 743.616825] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52217) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 743.616825] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 743.616825] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 743.616825] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 743.620592] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 743.621560] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 743.621560] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" 
acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 743.621711] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 743.681256] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 743.681994] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 743.681994] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 743.691364] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52217) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 743.691726] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 743.691961] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 743.692170] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None 
req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 743.695759] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 743.696472] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 743.696721] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 743.696919] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 743.727637] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 743.727879] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 743.728026] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 744.851448] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Took 0.15 seconds to select destinations for 1 instance(s). {{(pid=52216) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 744.867329] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 744.867440] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 744.867599] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 744.906983] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 744.907440] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 744.907792] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 744.908302] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 744.910254] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 744.910254] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 744.920684] nova-conductor[52216]: DEBUG nova.quota [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Getting quotas for project 7415f43b7da842daa77f717471dd89ac. Resources: {'cores', 'instances', 'ram'} {{(pid=52216) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 744.923700] nova-conductor[52216]: DEBUG nova.quota [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Getting quotas for user e8748e6b463047f7b05683169bf0a0e7 and project 7415f43b7da842daa77f717471dd89ac. 
Resources: {'cores', 'instances', 'ram'} {{(pid=52216) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 744.930379] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52216) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 744.931233] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 744.931578] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 744.931978] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 744.939497] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 744.939497] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 744.939649] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 
tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 744.939707] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 744.955604] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 744.955820] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 744.956049] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 748.027459] nova-conductor[52216]: ERROR nova.scheduler.utils [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File 
"/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 448e1b14-7b08-4172-9ff9-7e7ba7e84a68, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 6da1d5a5-ff2a-478d-97b8-bf237f844bae was re-scheduled: Binding failed for port 448e1b14-7b08-4172-9ff9-7e7ba7e84a68, please check neutron logs for more information.\n'] [ 748.027940] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Rescheduling: True {{(pid=52216) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 748.027940] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 6da1d5a5-ff2a-478d-97b8-bf237f844bae.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 6da1d5a5-ff2a-478d-97b8-bf237f844bae. 
[ 748.028232] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 6da1d5a5-ff2a-478d-97b8-bf237f844bae. [ 748.080467] nova-conductor[52216]: DEBUG nova.network.neutron [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] deallocate_for_instance() {{(pid=52216) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 748.348119] nova-conductor[52216]: DEBUG nova.network.neutron [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Instance cache missing network info. {{(pid=52216) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 748.350089] nova-conductor[52216]: DEBUG nova.network.neutron [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Updating instance_info_cache with network_info: [] {{(pid=52216) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 749.520213] nova-conductor[52217]: ERROR nova.conductor.manager [None req-c87dca61-d8e6-4dec-b256-f75494f2816a tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 749.520213] nova-conductor[52217]: Traceback (most recent call last): [ 749.520213] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 749.520213] nova-conductor[52217]: return func(*args, **kwargs) [ 749.520213] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 749.520213] nova-conductor[52217]: selections = self._select_destinations( [ 749.520213] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 749.520213] nova-conductor[52217]: selections = self._schedule( [ 749.520213] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 749.520213] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 749.520213] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 749.520213] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 749.520213] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 749.520213] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 749.520213] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 749.520213] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 749.520213] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 749.520213] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 749.520213] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 749.520213] nova-conductor[52217]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 749.520213] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 749.520213] nova-conductor[52217]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 749.520213] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 749.520213] nova-conductor[52217]: ERROR nova.conductor.manager result = self.transport._send( [ 749.520213] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 749.520213] nova-conductor[52217]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 749.520213] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 749.520213] nova-conductor[52217]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 749.520213] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 749.520213] nova-conductor[52217]: ERROR nova.conductor.manager raise result [ 749.520213] nova-conductor[52217]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 749.520213] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 749.520213] nova-conductor[52217]: ERROR nova.conductor.manager [ 749.520213] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 749.520213] nova-conductor[52217]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 749.520213] nova-conductor[52217]: ERROR nova.conductor.manager [ 749.520213] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 749.520213] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._select_destinations( [ 749.520213] nova-conductor[52217]: ERROR nova.conductor.manager [ 749.520213] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 749.520213] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._schedule( [ 749.520213] nova-conductor[52217]: ERROR nova.conductor.manager [ 749.520213] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 749.520213] nova-conductor[52217]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 749.520213] nova-conductor[52217]: ERROR nova.conductor.manager [ 749.520213] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 749.520213] nova-conductor[52217]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 749.520213] nova-conductor[52217]: ERROR nova.conductor.manager [ 749.520213] nova-conductor[52217]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 749.520213] nova-conductor[52217]: ERROR nova.conductor.manager [ 749.520213] nova-conductor[52217]: ERROR nova.conductor.manager [ 749.529024] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c87dca61-d8e6-4dec-b256-f75494f2816a tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 749.529024] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c87dca61-d8e6-4dec-b256-f75494f2816a tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 749.529024] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c87dca61-d8e6-4dec-b256-f75494f2816a tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 749.611418] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-c87dca61-d8e6-4dec-b256-f75494f2816a tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: ac4c9944-8ce3-4c29-892f-e0024006a209] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 749.611418] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c87dca61-d8e6-4dec-b256-f75494f2816a tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 749.611418] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c87dca61-d8e6-4dec-b256-f75494f2816a tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 749.611418] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c87dca61-d8e6-4dec-b256-f75494f2816a tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s 
{{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 749.616549] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-c87dca61-d8e6-4dec-b256-f75494f2816a tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 749.616549] nova-conductor[52217]: Traceback (most recent call last): [ 749.616549] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 749.616549] nova-conductor[52217]: return func(*args, **kwargs) [ 749.616549] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 749.616549] nova-conductor[52217]: selections = self._select_destinations( [ 749.616549] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 749.616549] nova-conductor[52217]: selections = self._schedule( [ 749.616549] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 749.616549] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 749.616549] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 749.616549] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 749.616549] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 749.616549] nova-conductor[52217]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 749.616549] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-c87dca61-d8e6-4dec-b256-f75494f2816a tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: ac4c9944-8ce3-4c29-892f-e0024006a209] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
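[annotation] The NoValidHost traceback above ends in _ensure_sufficient_hosts in nova/scheduler/manager.py. The snippet below only sketches the idea that check expresses; the real method has a different signature and more bookkeeping. Because the error is raised on the scheduler side and travels back over RPC, the conductor log reports it under the oslo.messaging remote name nova.exception_Remote.NoValidHost_Remote.

class NoValidHost(Exception):
    """Stand-in for nova.exception.NoValidHost."""


def ensure_sufficient_hosts(filtered_hosts: list, num_instances: int) -> None:
    """Illustrative sketch only: if filtering left fewer candidate hosts than
    instances requested, scheduling fails outright with NoValidHost."""
    if len(filtered_hosts) < num_instances:
        reason = "There are not enough hosts available."
        raise NoValidHost(f"No valid host was found. {reason}")


if __name__ == "__main__":
    try:
        ensure_sufficient_hosts(filtered_hosts=[], num_instances=1)
    except NoValidHost as exc:
        # Received over RPC, this surfaces in the conductor log as
        # NoValidHost_Remote before the instance is set to ERROR.
        print("scheduler rejected the request:", exc)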
[ 750.277270] nova-conductor[52217]: ERROR nova.scheduler.utils [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 662701b8-6fd9-446f-92d4-22ec5381326d, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n 
self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance d85db5e9-ce70-477d-bb5c-7665ab69b19a was re-scheduled: Binding failed for port 662701b8-6fd9-446f-92d4-22ec5381326d, please check neutron logs for more information.\n'] [ 750.277842] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Rescheduling: True {{(pid=52217) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 750.278461] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance d85db5e9-ce70-477d-bb5c-7665ab69b19a.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance d85db5e9-ce70-477d-bb5c-7665ab69b19a. [ 750.278717] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance d85db5e9-ce70-477d-bb5c-7665ab69b19a. [ 750.311172] nova-conductor[52217]: DEBUG nova.network.neutron [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] deallocate_for_instance() {{(pid=52217) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 750.383158] nova-conductor[52217]: DEBUG nova.network.neutron [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] Instance cache missing network info. 
{{(pid=52217) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 750.386642] nova-conductor[52217]: DEBUG nova.network.neutron [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] Updating instance_info_cache with network_info: [] {{(pid=52217) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 750.680649] nova-conductor[52216]: ERROR nova.scheduler.utils [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 7b5558e4-05fc-4755-accf-77228272884f] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File 
"/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 6319fa19-1380-4966-b525-c126a2fb5462, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 7b5558e4-05fc-4755-accf-77228272884f was re-scheduled: Binding failed for port 6319fa19-1380-4966-b525-c126a2fb5462, please check neutron logs for more information.\n'] [ 750.681227] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Rescheduling: True {{(pid=52216) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 750.681647] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 7b5558e4-05fc-4755-accf-77228272884f.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 7b5558e4-05fc-4755-accf-77228272884f. [ 750.681754] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 7b5558e4-05fc-4755-accf-77228272884f] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 7b5558e4-05fc-4755-accf-77228272884f. [ 750.705571] nova-conductor[52216]: DEBUG nova.network.neutron [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 7b5558e4-05fc-4755-accf-77228272884f] deallocate_for_instance() {{(pid=52216) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 750.755430] nova-conductor[52216]: DEBUG nova.network.neutron [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 7b5558e4-05fc-4755-accf-77228272884f] Instance cache missing network info. 
{{(pid=52216) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 750.758899] nova-conductor[52216]: DEBUG nova.network.neutron [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 7b5558e4-05fc-4755-accf-77228272884f] Updating instance_info_cache with network_info: [] {{(pid=52216) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 751.525941] nova-conductor[52216]: ERROR nova.conductor.manager [None req-0e4294c3-bc20-4080-a69a-7e61d055ba28 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 751.525941] nova-conductor[52216]: Traceback (most recent call last): [ 751.525941] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 751.525941] nova-conductor[52216]: return func(*args, **kwargs) [ 751.525941] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 751.525941] nova-conductor[52216]: selections = self._select_destinations( [ 751.525941] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 751.525941] nova-conductor[52216]: selections = self._schedule( [ 751.525941] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 751.525941] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 751.525941] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 751.525941] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 751.525941] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 751.525941] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 751.525941] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 751.525941] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 751.525941] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 751.525941] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 751.525941] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 751.525941] nova-conductor[52216]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 751.525941] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 751.525941] nova-conductor[52216]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 751.525941] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 751.525941] nova-conductor[52216]: ERROR nova.conductor.manager result = self.transport._send( [ 751.525941] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 751.525941] nova-conductor[52216]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 751.525941] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 751.525941] nova-conductor[52216]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 751.525941] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 751.525941] nova-conductor[52216]: ERROR nova.conductor.manager raise result [ 751.525941] nova-conductor[52216]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 751.525941] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 751.525941] nova-conductor[52216]: ERROR nova.conductor.manager [ 751.525941] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 751.525941] nova-conductor[52216]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 751.525941] nova-conductor[52216]: ERROR nova.conductor.manager [ 751.525941] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 751.525941] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._select_destinations( [ 751.525941] nova-conductor[52216]: ERROR nova.conductor.manager [ 751.525941] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 751.525941] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._schedule( [ 751.525941] nova-conductor[52216]: ERROR nova.conductor.manager [ 751.525941] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 751.525941] nova-conductor[52216]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 751.525941] nova-conductor[52216]: ERROR nova.conductor.manager [ 751.525941] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 751.525941] nova-conductor[52216]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 751.525941] nova-conductor[52216]: ERROR nova.conductor.manager [ 751.525941] nova-conductor[52216]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 751.525941] nova-conductor[52216]: ERROR nova.conductor.manager [ 751.525941] nova-conductor[52216]: ERROR nova.conductor.manager [ 751.532889] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-0e4294c3-bc20-4080-a69a-7e61d055ba28 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 751.533147] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-0e4294c3-bc20-4080-a69a-7e61d055ba28 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 751.533316] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-0e4294c3-bc20-4080-a69a-7e61d055ba28 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 751.634791] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-0e4294c3-bc20-4080-a69a-7e61d055ba28 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] [instance: 4952dc46-6eac-4d35-a06b-a5587abc5493] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 751.635526] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-0e4294c3-bc20-4080-a69a-7e61d055ba28 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 751.635741] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-0e4294c3-bc20-4080-a69a-7e61d055ba28 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 751.636059] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-0e4294c3-bc20-4080-a69a-7e61d055ba28 tempest-ServerRescueNegativeTestJSON-1950675257 
tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 751.638921] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-0e4294c3-bc20-4080-a69a-7e61d055ba28 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 751.638921] nova-conductor[52216]: Traceback (most recent call last): [ 751.638921] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 751.638921] nova-conductor[52216]: return func(*args, **kwargs) [ 751.638921] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 751.638921] nova-conductor[52216]: selections = self._select_destinations( [ 751.638921] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 751.638921] nova-conductor[52216]: selections = self._schedule( [ 751.638921] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 751.638921] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 751.638921] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 751.638921] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 751.638921] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 751.638921] nova-conductor[52216]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 751.639464] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-0e4294c3-bc20-4080-a69a-7e61d055ba28 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] [instance: 4952dc46-6eac-4d35-a06b-a5587abc5493] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 754.190455] nova-conductor[52217]: ERROR nova.conductor.manager [None req-6c83fab6-36ef-4265-9cfc-9837bc883f85 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 754.190455] nova-conductor[52217]: Traceback (most recent call last): [ 754.190455] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 754.190455] nova-conductor[52217]: return func(*args, **kwargs) [ 754.190455] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 754.190455] nova-conductor[52217]: selections = self._select_destinations( [ 754.190455] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 754.190455] nova-conductor[52217]: selections = self._schedule( [ 754.190455] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 754.190455] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 754.190455] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 754.190455] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 754.190455] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 754.190455] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 754.190455] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 754.190455] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 754.190455] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 754.190455] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 754.190455] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 754.190455] nova-conductor[52217]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 754.190455] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 754.190455] nova-conductor[52217]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 754.190455] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 754.190455] nova-conductor[52217]: ERROR nova.conductor.manager result = self.transport._send( [ 754.190455] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 754.190455] nova-conductor[52217]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 754.190455] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 754.190455] nova-conductor[52217]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 754.190455] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 754.190455] nova-conductor[52217]: ERROR nova.conductor.manager raise result [ 754.190455] nova-conductor[52217]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 754.190455] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 754.190455] nova-conductor[52217]: ERROR nova.conductor.manager [ 754.190455] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 754.190455] nova-conductor[52217]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 754.190455] nova-conductor[52217]: ERROR nova.conductor.manager [ 754.190455] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 754.190455] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._select_destinations( [ 754.190455] nova-conductor[52217]: ERROR nova.conductor.manager [ 754.190455] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 754.190455] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._schedule( [ 754.190455] nova-conductor[52217]: ERROR nova.conductor.manager [ 754.190455] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 754.190455] nova-conductor[52217]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 754.190455] nova-conductor[52217]: ERROR nova.conductor.manager [ 754.190455] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 754.190455] nova-conductor[52217]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 754.190455] nova-conductor[52217]: ERROR nova.conductor.manager [ 754.190455] nova-conductor[52217]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 754.190455] nova-conductor[52217]: ERROR nova.conductor.manager [ 754.190455] nova-conductor[52217]: ERROR nova.conductor.manager [ 754.199339] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-6c83fab6-36ef-4265-9cfc-9837bc883f85 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 754.201504] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-6c83fab6-36ef-4265-9cfc-9837bc883f85 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 754.201504] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-6c83fab6-36ef-4265-9cfc-9837bc883f85 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 754.274726] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-6c83fab6-36ef-4265-9cfc-9837bc883f85 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: b68395b8-a6d3-4d0c-9ed8-6e62c853a700] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 754.275443] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-6c83fab6-36ef-4265-9cfc-9837bc883f85 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 754.275646] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-6c83fab6-36ef-4265-9cfc-9837bc883f85 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 754.275806] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-6c83fab6-36ef-4265-9cfc-9837bc883f85 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s 
{{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 754.279011] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-6c83fab6-36ef-4265-9cfc-9837bc883f85 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 754.279011] nova-conductor[52217]: Traceback (most recent call last): [ 754.279011] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 754.279011] nova-conductor[52217]: return func(*args, **kwargs) [ 754.279011] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 754.279011] nova-conductor[52217]: selections = self._select_destinations( [ 754.279011] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 754.279011] nova-conductor[52217]: selections = self._schedule( [ 754.279011] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 754.279011] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 754.279011] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 754.279011] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 754.279011] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 754.279011] nova-conductor[52217]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 754.279536] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-6c83fab6-36ef-4265-9cfc-9837bc883f85 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: b68395b8-a6d3-4d0c-9ed8-6e62c853a700] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
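Note on the repeated NoValidHost failures above: every one of these tracebacks bottoms out in the same scheduler check, select_destinations -> _select_destinations -> _schedule -> _ensure_sufficient_hosts in /opt/stack/nova/nova/scheduler/manager.py, which raises nova.exception.NoValidHost when fewer hosts survive filtering than the request needs. The following is only a minimal sketch of that final check, with simplified names and signatures (the real method also handles instance groups and alternate selections), not a quote of the Nova source:

    # Hedged sketch of the host-sufficiency check the tracebacks above end in.
    class NoValidHost(Exception):
        def __init__(self, reason):
            super().__init__("No valid host was found. " + reason)

    def ensure_sufficient_hosts(hosts, required_count):
        """Fail the whole request if filtering left too few candidate hosts."""
        if len(hosts) < required_count:
            raise NoValidHost(reason="There are not enough hosts available.")

    # With a single compute host already marked as failed, the candidate list is
    # empty and a request for one instance fails exactly as logged above:
    # ensure_sufficient_hosts([], 1)
    #   -> NoValidHost: No valid host was found. There are not enough hosts available.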
[ 754.713355] nova-conductor[52217]: ERROR nova.scheduler.utils [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port ba41efcb-e78a-4f3e-a135-67f3325e12a4, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, 
instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance fea7d2f4-199d-4c76-84cd-4ee7820990ec was re-scheduled: Binding failed for port ba41efcb-e78a-4f3e-a135-67f3325e12a4, please check neutron logs for more information.\n'] [ 754.714475] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Rescheduling: True {{(pid=52217) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 754.715495] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance fea7d2f4-199d-4c76-84cd-4ee7820990ec.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance fea7d2f4-199d-4c76-84cd-4ee7820990ec. [ 754.715735] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance fea7d2f4-199d-4c76-84cd-4ee7820990ec. [ 754.747954] nova-conductor[52217]: DEBUG nova.network.neutron [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] deallocate_for_instance() {{(pid=52217) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 754.826542] nova-conductor[52217]: DEBUG nova.network.neutron [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] Instance cache missing network info. 
{{(pid=52217) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 754.833979] nova-conductor[52217]: DEBUG nova.network.neutron [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] Updating instance_info_cache with network_info: [] {{(pid=52217) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 755.230406] nova-conductor[52217]: ERROR nova.scheduler.utils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 
294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 936854f8-ea3b-4b34-a17f-3a77b5316ae0, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 5c21177e-6cff-414f-bff1-bac166929cab was re-scheduled: Binding failed for port 936854f8-ea3b-4b34-a17f-3a77b5316ae0, please check neutron logs for more information.\n'] [ 755.230406] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Rescheduling: True {{(pid=52217) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 755.230406] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 5c21177e-6cff-414f-bff1-bac166929cab.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 5c21177e-6cff-414f-bff1-bac166929cab. [ 755.230406] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 5c21177e-6cff-414f-bff1-bac166929cab. 
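Each "Error from last host" traceback in this section ends in /opt/stack/nova/nova/network/neutron.py with _ensure_no_port_binding_failure(port) raising PortBindingFailed for the port Neutron could not bind. The check is roughly of the following shape; treating a 'binding:vif_type' of "binding_failed" as the failure marker is an assumption about how Neutron reports a failed binding, not a quote of the Nova source:

    # Hedged sketch of the port-binding check seen in the tracebacks above.
    class PortBindingFailed(Exception):
        def __init__(self, port_id):
            super().__init__(
                f"Binding failed for port {port_id}, "
                "please check neutron logs for more information.")

    def ensure_no_port_binding_failure(port: dict) -> None:
        # Assumption: a port whose binding failed comes back from Neutron with
        # binding:vif_type set to "binding_failed"; Nova refuses to use it.
        if port.get("binding:vif_type") == "binding_failed":
            raise PortBindingFailed(port_id=port["id"])

    # A port Neutron failed to bind (for example, no live L2 agent on the host)
    # trips the check the same way the ports in the log excerpts do:
    # ensure_no_port_binding_failure({"id": "<port-uuid>",
    #                                 "binding:vif_type": "binding_failed"})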
[ 755.253602] nova-conductor[52217]: DEBUG nova.network.neutron [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] deallocate_for_instance() {{(pid=52217) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 755.260273] nova-conductor[52217]: ERROR nova.scheduler.utils [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] [instance: 671639d6-3103-4eeb-86d3-b858a3919396] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 
'nova.exception.PortBindingFailed: Binding failed for port a4475db3-1cfd-47df-9fd1-b9a0d4fa35a1, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 671639d6-3103-4eeb-86d3-b858a3919396 was re-scheduled: Binding failed for port a4475db3-1cfd-47df-9fd1-b9a0d4fa35a1, please check neutron logs for more information.\n'] [ 755.260883] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Rescheduling: True {{(pid=52217) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 755.261367] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 671639d6-3103-4eeb-86d3-b858a3919396.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 671639d6-3103-4eeb-86d3-b858a3919396. [ 755.261367] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] [instance: 671639d6-3103-4eeb-86d3-b858a3919396] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 671639d6-3103-4eeb-86d3-b858a3919396. [ 755.281156] nova-conductor[52217]: DEBUG nova.network.neutron [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] [instance: 671639d6-3103-4eeb-86d3-b858a3919396] deallocate_for_instance() {{(pid=52217) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 755.321248] nova-conductor[52217]: DEBUG nova.network.neutron [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] Instance cache missing network info. 
{{(pid=52217) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 755.326166] nova-conductor[52217]: DEBUG nova.network.neutron [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] Updating instance_info_cache with network_info: [] {{(pid=52217) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 755.344114] nova-conductor[52217]: DEBUG nova.network.neutron [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] [instance: 671639d6-3103-4eeb-86d3-b858a3919396] Instance cache missing network info. {{(pid=52217) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 755.347443] nova-conductor[52217]: DEBUG nova.network.neutron [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] [instance: 671639d6-3103-4eeb-86d3-b858a3919396] Updating instance_info_cache with network_info: [] {{(pid=52217) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 755.954939] nova-conductor[52216]: ERROR nova.scheduler.utils [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File 
"/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 76a7ae5a-7d40-42c4-936d-36e078fb7820, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance d26dfe85-1a71-48e1-b462-f26f1327a9e7 was re-scheduled: Binding failed for port 76a7ae5a-7d40-42c4-936d-36e078fb7820, please check neutron logs for more information.\n'] [ 755.954939] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Rescheduling: True {{(pid=52216) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 755.954939] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance d26dfe85-1a71-48e1-b462-f26f1327a9e7.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance d26dfe85-1a71-48e1-b462-f26f1327a9e7. [ 755.954939] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance d26dfe85-1a71-48e1-b462-f26f1327a9e7. 
[ 755.972914] nova-conductor[52216]: DEBUG nova.network.neutron [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] deallocate_for_instance() {{(pid=52216) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 756.034503] nova-conductor[52216]: DEBUG nova.network.neutron [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] Instance cache missing network info. {{(pid=52216) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 756.039038] nova-conductor[52216]: DEBUG nova.network.neutron [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] Updating instance_info_cache with network_info: [] {{(pid=52216) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 756.379715] nova-conductor[52217]: ERROR nova.conductor.manager [None req-b3114999-dd8d-4855-9fa6-1b3be24730da tempest-ServersTestJSON-1854174598 tempest-ServersTestJSON-1854174598-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 756.379715] nova-conductor[52217]: Traceback (most recent call last): [ 756.379715] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 756.379715] nova-conductor[52217]: return func(*args, **kwargs) [ 756.379715] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 756.379715] nova-conductor[52217]: selections = self._select_destinations( [ 756.379715] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 756.379715] nova-conductor[52217]: selections = self._schedule( [ 756.379715] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 756.379715] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 756.379715] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 756.379715] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 756.379715] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 756.379715] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 756.379715] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 756.379715] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 756.379715] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 756.379715] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 756.379715] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 756.379715] nova-conductor[52217]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 756.379715] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 756.379715] nova-conductor[52217]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 756.379715] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 756.379715] nova-conductor[52217]: ERROR nova.conductor.manager result = self.transport._send( [ 756.379715] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 756.379715] nova-conductor[52217]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 756.379715] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 756.379715] nova-conductor[52217]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 756.379715] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 756.379715] nova-conductor[52217]: ERROR nova.conductor.manager raise result [ 756.379715] nova-conductor[52217]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 756.379715] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 756.379715] nova-conductor[52217]: ERROR nova.conductor.manager [ 756.379715] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 756.379715] nova-conductor[52217]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 756.379715] nova-conductor[52217]: ERROR nova.conductor.manager [ 756.379715] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 756.379715] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._select_destinations( [ 756.379715] nova-conductor[52217]: ERROR nova.conductor.manager [ 756.379715] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 756.379715] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._schedule( [ 756.379715] nova-conductor[52217]: ERROR nova.conductor.manager [ 756.379715] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 756.379715] nova-conductor[52217]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 756.379715] nova-conductor[52217]: ERROR nova.conductor.manager [ 756.379715] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 756.379715] nova-conductor[52217]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 756.379715] nova-conductor[52217]: ERROR nova.conductor.manager [ 756.379715] nova-conductor[52217]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 756.379715] nova-conductor[52217]: ERROR nova.conductor.manager [ 756.379715] nova-conductor[52217]: ERROR nova.conductor.manager [ 756.393978] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-b3114999-dd8d-4855-9fa6-1b3be24730da tempest-ServersTestJSON-1854174598 tempest-ServersTestJSON-1854174598-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 756.394224] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-b3114999-dd8d-4855-9fa6-1b3be24730da tempest-ServersTestJSON-1854174598 tempest-ServersTestJSON-1854174598-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 756.394628] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-b3114999-dd8d-4855-9fa6-1b3be24730da tempest-ServersTestJSON-1854174598 tempest-ServersTestJSON-1854174598-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 756.465118] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-b3114999-dd8d-4855-9fa6-1b3be24730da tempest-ServersTestJSON-1854174598 tempest-ServersTestJSON-1854174598-project-member] [instance: 772c63c2-bae6-4eed-a7dc-7f3c4d741349] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 756.465118] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-b3114999-dd8d-4855-9fa6-1b3be24730da tempest-ServersTestJSON-1854174598 tempest-ServersTestJSON-1854174598-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 756.465118] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-b3114999-dd8d-4855-9fa6-1b3be24730da tempest-ServersTestJSON-1854174598 tempest-ServersTestJSON-1854174598-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 756.465118] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-b3114999-dd8d-4855-9fa6-1b3be24730da tempest-ServersTestJSON-1854174598 tempest-ServersTestJSON-1854174598-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" 
:: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 756.468930] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-b3114999-dd8d-4855-9fa6-1b3be24730da tempest-ServersTestJSON-1854174598 tempest-ServersTestJSON-1854174598-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 756.468930] nova-conductor[52217]: Traceback (most recent call last): [ 756.468930] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 756.468930] nova-conductor[52217]: return func(*args, **kwargs) [ 756.468930] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 756.468930] nova-conductor[52217]: selections = self._select_destinations( [ 756.468930] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 756.468930] nova-conductor[52217]: selections = self._schedule( [ 756.468930] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 756.468930] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 756.468930] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 756.468930] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 756.468930] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 756.468930] nova-conductor[52217]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 756.469433] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-b3114999-dd8d-4855-9fa6-1b3be24730da tempest-ServersTestJSON-1854174598 tempest-ServersTestJSON-1854174598-project-member] [instance: 772c63c2-bae6-4eed-a7dc-7f3c4d741349] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 757.310614] nova-conductor[52217]: ERROR nova.scheduler.utils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 41411f7c-9337-45a9-9fd8-f1fa18d4a0eb, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n 
self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance f61f8046-f2ee-4de3-9c45-de52c2849399 was re-scheduled: Binding failed for port 41411f7c-9337-45a9-9fd8-f1fa18d4a0eb, please check neutron logs for more information.\n'] [ 757.310614] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Rescheduling: True {{(pid=52217) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 757.310614] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance f61f8046-f2ee-4de3-9c45-de52c2849399.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance f61f8046-f2ee-4de3-9c45-de52c2849399. [ 757.310614] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance f61f8046-f2ee-4de3-9c45-de52c2849399. [ 757.329034] nova-conductor[52217]: DEBUG nova.network.neutron [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] deallocate_for_instance() {{(pid=52217) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 757.393493] nova-conductor[52217]: DEBUG nova.network.neutron [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] Instance cache missing network info. 
{{(pid=52217) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 757.409045] nova-conductor[52217]: DEBUG nova.network.neutron [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] Updating instance_info_cache with network_info: [] {{(pid=52217) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 757.493309] nova-conductor[52217]: ERROR nova.scheduler.utils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File 
"/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port adf341d9-a3d3-4ce9-97cf-6c6cfb94962d, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70 was re-scheduled: Binding failed for port adf341d9-a3d3-4ce9-97cf-6c6cfb94962d, please check neutron logs for more information.\n'] [ 757.494160] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Rescheduling: True {{(pid=52217) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 757.494160] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70. [ 757.499463] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70. [ 757.552565] nova-conductor[52217]: DEBUG nova.network.neutron [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] deallocate_for_instance() {{(pid=52217) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 757.598466] nova-conductor[52217]: DEBUG nova.network.neutron [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] Instance cache missing network info. 
{{(pid=52217) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 757.602705] nova-conductor[52217]: DEBUG nova.network.neutron [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] Updating instance_info_cache with network_info: [] {{(pid=52217) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 758.447376] nova-conductor[52217]: ERROR nova.conductor.manager [None req-b31545d4-f981-4d4d-a604-ae58b920aebc tempest-ServerActionsTestOtherB-111682788 tempest-ServerActionsTestOtherB-111682788-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 758.447376] nova-conductor[52217]: Traceback (most recent call last): [ 758.447376] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 758.447376] nova-conductor[52217]: return func(*args, **kwargs) [ 758.447376] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 758.447376] nova-conductor[52217]: selections = self._select_destinations( [ 758.447376] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 758.447376] nova-conductor[52217]: selections = self._schedule( [ 758.447376] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 758.447376] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 758.447376] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 758.447376] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 758.447376] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 758.447376] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 758.447376] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 758.447376] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 758.447376] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 758.447376] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 758.447376] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 758.447376] nova-conductor[52217]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 758.447376] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 758.447376] nova-conductor[52217]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 758.447376] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 758.447376] nova-conductor[52217]: ERROR nova.conductor.manager result = self.transport._send( [ 758.447376] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 758.447376] nova-conductor[52217]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 758.447376] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 758.447376] nova-conductor[52217]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 758.447376] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 758.447376] nova-conductor[52217]: ERROR nova.conductor.manager raise result [ 758.447376] nova-conductor[52217]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 758.447376] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 758.447376] nova-conductor[52217]: ERROR nova.conductor.manager [ 758.447376] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 758.447376] nova-conductor[52217]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 758.447376] nova-conductor[52217]: ERROR nova.conductor.manager [ 758.447376] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 758.447376] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._select_destinations( [ 758.447376] nova-conductor[52217]: ERROR nova.conductor.manager [ 758.447376] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 758.447376] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._schedule( [ 758.447376] nova-conductor[52217]: ERROR nova.conductor.manager [ 758.447376] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 758.447376] nova-conductor[52217]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 758.447376] nova-conductor[52217]: ERROR nova.conductor.manager [ 758.447376] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 758.447376] nova-conductor[52217]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 758.447376] nova-conductor[52217]: ERROR nova.conductor.manager [ 758.447376] nova-conductor[52217]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 758.447376] nova-conductor[52217]: ERROR nova.conductor.manager [ 758.447376] nova-conductor[52217]: ERROR nova.conductor.manager [ 758.455610] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-b31545d4-f981-4d4d-a604-ae58b920aebc tempest-ServerActionsTestOtherB-111682788 tempest-ServerActionsTestOtherB-111682788-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 758.455904] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-b31545d4-f981-4d4d-a604-ae58b920aebc tempest-ServerActionsTestOtherB-111682788 tempest-ServerActionsTestOtherB-111682788-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 758.456128] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-b31545d4-f981-4d4d-a604-ae58b920aebc tempest-ServerActionsTestOtherB-111682788 tempest-ServerActionsTestOtherB-111682788-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 758.508014] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-b31545d4-f981-4d4d-a604-ae58b920aebc tempest-ServerActionsTestOtherB-111682788 tempest-ServerActionsTestOtherB-111682788-project-member] [instance: d40cf2dd-a1df-4001-9981-e9124c5f97ff] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 758.508810] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-b31545d4-f981-4d4d-a604-ae58b920aebc tempest-ServerActionsTestOtherB-111682788 tempest-ServerActionsTestOtherB-111682788-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 758.509042] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-b31545d4-f981-4d4d-a604-ae58b920aebc tempest-ServerActionsTestOtherB-111682788 tempest-ServerActionsTestOtherB-111682788-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 758.509217] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-b31545d4-f981-4d4d-a604-ae58b920aebc tempest-ServerActionsTestOtherB-111682788 tempest-ServerActionsTestOtherB-111682788-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 758.512683] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-b31545d4-f981-4d4d-a604-ae58b920aebc tempest-ServerActionsTestOtherB-111682788 tempest-ServerActionsTestOtherB-111682788-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 758.512683] nova-conductor[52217]: Traceback (most recent call last): [ 758.512683] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 758.512683] nova-conductor[52217]: return func(*args, **kwargs) [ 758.512683] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 758.512683] nova-conductor[52217]: selections = self._select_destinations( [ 758.512683] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 758.512683] nova-conductor[52217]: selections = self._schedule( [ 758.512683] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 758.512683] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 758.512683] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 758.512683] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 758.512683] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 758.512683] nova-conductor[52217]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 758.513408] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-b31545d4-f981-4d4d-a604-ae58b920aebc tempest-ServerActionsTestOtherB-111682788 tempest-ServerActionsTestOtherB-111682788-project-member] [instance: d40cf2dd-a1df-4001-9981-e9124c5f97ff] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 759.486119] nova-conductor[52216]: ERROR nova.conductor.manager [None req-356e2885-5eaf-4a98-b9f6-c45080d9d23b tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 759.486119] nova-conductor[52216]: Traceback (most recent call last): [ 759.486119] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 759.486119] nova-conductor[52216]: return func(*args, **kwargs) [ 759.486119] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 759.486119] nova-conductor[52216]: selections = self._select_destinations( [ 759.486119] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 759.486119] nova-conductor[52216]: selections = self._schedule( [ 759.486119] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 759.486119] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 759.486119] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 759.486119] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 759.486119] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 759.486119] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 759.486119] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 759.486119] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 759.486119] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 759.486119] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 759.486119] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 759.486119] nova-conductor[52216]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 759.486119] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 759.486119] nova-conductor[52216]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 759.486119] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 759.486119] nova-conductor[52216]: ERROR nova.conductor.manager result = self.transport._send( [ 759.486119] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 759.486119] nova-conductor[52216]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 759.486119] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 759.486119] nova-conductor[52216]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 759.486119] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 759.486119] nova-conductor[52216]: ERROR nova.conductor.manager raise result [ 759.486119] nova-conductor[52216]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 759.486119] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 759.486119] nova-conductor[52216]: ERROR nova.conductor.manager [ 759.486119] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 759.486119] nova-conductor[52216]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 759.486119] nova-conductor[52216]: ERROR nova.conductor.manager [ 759.486119] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 759.486119] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._select_destinations( [ 759.486119] nova-conductor[52216]: ERROR nova.conductor.manager [ 759.486119] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 759.486119] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._schedule( [ 759.486119] nova-conductor[52216]: ERROR nova.conductor.manager [ 759.486119] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 759.486119] nova-conductor[52216]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 759.486119] nova-conductor[52216]: ERROR nova.conductor.manager [ 759.486119] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 759.486119] nova-conductor[52216]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 759.486119] nova-conductor[52216]: ERROR nova.conductor.manager [ 759.486119] nova-conductor[52216]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 759.486119] nova-conductor[52216]: ERROR nova.conductor.manager [ 759.486119] nova-conductor[52216]: ERROR nova.conductor.manager [ 759.496232] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-356e2885-5eaf-4a98-b9f6-c45080d9d23b tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 759.496467] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-356e2885-5eaf-4a98-b9f6-c45080d9d23b tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 759.496637] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-356e2885-5eaf-4a98-b9f6-c45080d9d23b tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 759.546849] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-356e2885-5eaf-4a98-b9f6-c45080d9d23b tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] [instance: 8878a4a3-6faf-45d1-ab39-75584f4af358] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 759.547792] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-356e2885-5eaf-4a98-b9f6-c45080d9d23b tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 759.547792] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-356e2885-5eaf-4a98-b9f6-c45080d9d23b tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 759.547792] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-356e2885-5eaf-4a98-b9f6-c45080d9d23b tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 759.553017] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-356e2885-5eaf-4a98-b9f6-c45080d9d23b tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 759.553017] nova-conductor[52216]: Traceback (most recent call last): [ 759.553017] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 759.553017] nova-conductor[52216]: return func(*args, **kwargs) [ 759.553017] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 759.553017] nova-conductor[52216]: selections = self._select_destinations( [ 759.553017] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 759.553017] nova-conductor[52216]: selections = self._schedule( [ 759.553017] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 759.553017] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 759.553017] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 759.553017] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 759.553017] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 759.553017] nova-conductor[52216]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 759.553017] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-356e2885-5eaf-4a98-b9f6-c45080d9d23b tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] [instance: 8878a4a3-6faf-45d1-ab39-75584f4af358] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 760.535247] nova-conductor[52217]: ERROR nova.conductor.manager [None req-c741f7e5-e9a6-4367-919e-d3aef2f714e2 tempest-InstanceActionsV221TestJSON-1302632641 tempest-InstanceActionsV221TestJSON-1302632641-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 760.535247] nova-conductor[52217]: Traceback (most recent call last): [ 760.535247] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 760.535247] nova-conductor[52217]: return func(*args, **kwargs) [ 760.535247] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 760.535247] nova-conductor[52217]: selections = self._select_destinations( [ 760.535247] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 760.535247] nova-conductor[52217]: selections = self._schedule( [ 760.535247] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 760.535247] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 760.535247] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 760.535247] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 760.535247] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 760.535247] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 760.535247] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 760.535247] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 760.535247] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 760.535247] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 760.535247] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 760.535247] nova-conductor[52217]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 760.535247] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 760.535247] nova-conductor[52217]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 760.535247] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 760.535247] nova-conductor[52217]: ERROR nova.conductor.manager result = self.transport._send( [ 760.535247] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 760.535247] nova-conductor[52217]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 760.535247] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 760.535247] nova-conductor[52217]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 760.535247] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 760.535247] nova-conductor[52217]: ERROR nova.conductor.manager raise result [ 760.535247] nova-conductor[52217]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 760.535247] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 760.535247] nova-conductor[52217]: ERROR nova.conductor.manager [ 760.535247] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 760.535247] nova-conductor[52217]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 760.535247] nova-conductor[52217]: ERROR nova.conductor.manager [ 760.535247] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 760.535247] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._select_destinations( [ 760.535247] nova-conductor[52217]: ERROR nova.conductor.manager [ 760.535247] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 760.535247] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._schedule( [ 760.535247] nova-conductor[52217]: ERROR nova.conductor.manager [ 760.535247] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 760.535247] nova-conductor[52217]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 760.535247] nova-conductor[52217]: ERROR nova.conductor.manager [ 760.535247] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 760.535247] nova-conductor[52217]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 760.535247] nova-conductor[52217]: ERROR nova.conductor.manager [ 760.535247] nova-conductor[52217]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 760.535247] nova-conductor[52217]: ERROR nova.conductor.manager [ 760.535247] nova-conductor[52217]: ERROR nova.conductor.manager [ 760.542885] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c741f7e5-e9a6-4367-919e-d3aef2f714e2 tempest-InstanceActionsV221TestJSON-1302632641 tempest-InstanceActionsV221TestJSON-1302632641-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 760.543213] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c741f7e5-e9a6-4367-919e-d3aef2f714e2 tempest-InstanceActionsV221TestJSON-1302632641 tempest-InstanceActionsV221TestJSON-1302632641-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 760.543489] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c741f7e5-e9a6-4367-919e-d3aef2f714e2 tempest-InstanceActionsV221TestJSON-1302632641 tempest-InstanceActionsV221TestJSON-1302632641-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 760.592563] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-c741f7e5-e9a6-4367-919e-d3aef2f714e2 tempest-InstanceActionsV221TestJSON-1302632641 tempest-InstanceActionsV221TestJSON-1302632641-project-member] [instance: 28a50b03-6ae6-4a63-a73d-470d9c44e508] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 760.593413] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c741f7e5-e9a6-4367-919e-d3aef2f714e2 tempest-InstanceActionsV221TestJSON-1302632641 tempest-InstanceActionsV221TestJSON-1302632641-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 760.593533] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c741f7e5-e9a6-4367-919e-d3aef2f714e2 tempest-InstanceActionsV221TestJSON-1302632641 tempest-InstanceActionsV221TestJSON-1302632641-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 760.593694] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c741f7e5-e9a6-4367-919e-d3aef2f714e2 tempest-InstanceActionsV221TestJSON-1302632641 
tempest-InstanceActionsV221TestJSON-1302632641-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 760.597229] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-c741f7e5-e9a6-4367-919e-d3aef2f714e2 tempest-InstanceActionsV221TestJSON-1302632641 tempest-InstanceActionsV221TestJSON-1302632641-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 760.597229] nova-conductor[52217]: Traceback (most recent call last): [ 760.597229] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 760.597229] nova-conductor[52217]: return func(*args, **kwargs) [ 760.597229] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 760.597229] nova-conductor[52217]: selections = self._select_destinations( [ 760.597229] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 760.597229] nova-conductor[52217]: selections = self._schedule( [ 760.597229] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 760.597229] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 760.597229] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 760.597229] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 760.597229] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 760.597229] nova-conductor[52217]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 760.597738] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-c741f7e5-e9a6-4367-919e-d3aef2f714e2 tempest-InstanceActionsV221TestJSON-1302632641 tempest-InstanceActionsV221TestJSON-1302632641-project-member] [instance: 28a50b03-6ae6-4a63-a73d-470d9c44e508] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 760.947749] nova-conductor[52216]: ERROR nova.conductor.manager [None req-5d89836a-7fac-4c84-a7c5-8bbc27d12117 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 760.947749] nova-conductor[52216]: Traceback (most recent call last): [ 760.947749] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 760.947749] nova-conductor[52216]: return func(*args, **kwargs) [ 760.947749] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 760.947749] nova-conductor[52216]: selections = self._select_destinations( [ 760.947749] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 760.947749] nova-conductor[52216]: selections = self._schedule( [ 760.947749] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 760.947749] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 760.947749] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 760.947749] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 760.947749] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 760.947749] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 760.947749] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 760.947749] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 760.947749] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 760.947749] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 760.947749] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 760.947749] nova-conductor[52216]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 760.947749] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 760.947749] nova-conductor[52216]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 760.947749] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 760.947749] nova-conductor[52216]: ERROR nova.conductor.manager result = self.transport._send( [ 760.947749] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 760.947749] nova-conductor[52216]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 760.947749] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 760.947749] nova-conductor[52216]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 760.947749] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 760.947749] nova-conductor[52216]: ERROR nova.conductor.manager raise result [ 760.947749] nova-conductor[52216]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 760.947749] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 760.947749] nova-conductor[52216]: ERROR nova.conductor.manager [ 760.947749] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 760.947749] nova-conductor[52216]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 760.947749] nova-conductor[52216]: ERROR nova.conductor.manager [ 760.947749] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 760.947749] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._select_destinations( [ 760.947749] nova-conductor[52216]: ERROR nova.conductor.manager [ 760.947749] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 760.947749] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._schedule( [ 760.947749] nova-conductor[52216]: ERROR nova.conductor.manager [ 760.947749] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 760.947749] nova-conductor[52216]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 760.947749] nova-conductor[52216]: ERROR nova.conductor.manager [ 760.947749] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 760.947749] nova-conductor[52216]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 760.947749] nova-conductor[52216]: ERROR nova.conductor.manager [ 760.947749] nova-conductor[52216]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 760.947749] nova-conductor[52216]: ERROR nova.conductor.manager [ 760.947749] nova-conductor[52216]: ERROR nova.conductor.manager [ 760.962095] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5d89836a-7fac-4c84-a7c5-8bbc27d12117 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 760.962328] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5d89836a-7fac-4c84-a7c5-8bbc27d12117 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 760.962491] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5d89836a-7fac-4c84-a7c5-8bbc27d12117 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 761.010222] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-5d89836a-7fac-4c84-a7c5-8bbc27d12117 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: aab885de-0781-49c2-b5d9-a62e77fb032f] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 761.010928] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5d89836a-7fac-4c84-a7c5-8bbc27d12117 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 761.011154] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5d89836a-7fac-4c84-a7c5-8bbc27d12117 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 761.011324] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-5d89836a-7fac-4c84-a7c5-8bbc27d12117 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s 
{{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 761.014314] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-5d89836a-7fac-4c84-a7c5-8bbc27d12117 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 761.014314] nova-conductor[52216]: Traceback (most recent call last): [ 761.014314] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 761.014314] nova-conductor[52216]: return func(*args, **kwargs) [ 761.014314] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 761.014314] nova-conductor[52216]: selections = self._select_destinations( [ 761.014314] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 761.014314] nova-conductor[52216]: selections = self._schedule( [ 761.014314] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 761.014314] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 761.014314] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 761.014314] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 761.014314] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 761.014314] nova-conductor[52216]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 761.014828] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-5d89836a-7fac-4c84-a7c5-8bbc27d12117 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: aab885de-0781-49c2-b5d9-a62e77fb032f] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 761.890294] nova-conductor[52217]: ERROR nova.conductor.manager [None req-7e00bdda-1f31-4e8a-ba4b-cf68b4dc8443 tempest-ServerPasswordTestJSON-1785275738 tempest-ServerPasswordTestJSON-1785275738-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 761.890294] nova-conductor[52217]: Traceback (most recent call last): [ 761.890294] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 761.890294] nova-conductor[52217]: return func(*args, **kwargs) [ 761.890294] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 761.890294] nova-conductor[52217]: selections = self._select_destinations( [ 761.890294] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 761.890294] nova-conductor[52217]: selections = self._schedule( [ 761.890294] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 761.890294] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 761.890294] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 761.890294] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 761.890294] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 761.890294] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 761.890294] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 761.890294] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 761.890294] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 761.890294] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 761.890294] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 761.890294] nova-conductor[52217]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 761.890294] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 761.890294] nova-conductor[52217]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 761.890294] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 761.890294] nova-conductor[52217]: ERROR nova.conductor.manager result = self.transport._send( [ 761.890294] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 761.890294] nova-conductor[52217]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 761.890294] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 761.890294] nova-conductor[52217]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 761.890294] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 761.890294] nova-conductor[52217]: ERROR nova.conductor.manager raise result [ 761.890294] nova-conductor[52217]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 761.890294] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 761.890294] nova-conductor[52217]: ERROR nova.conductor.manager [ 761.890294] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 761.890294] nova-conductor[52217]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 761.890294] nova-conductor[52217]: ERROR nova.conductor.manager [ 761.890294] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 761.890294] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._select_destinations( [ 761.890294] nova-conductor[52217]: ERROR nova.conductor.manager [ 761.890294] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 761.890294] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._schedule( [ 761.890294] nova-conductor[52217]: ERROR nova.conductor.manager [ 761.890294] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 761.890294] nova-conductor[52217]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 761.890294] nova-conductor[52217]: ERROR nova.conductor.manager [ 761.890294] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 761.890294] nova-conductor[52217]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 761.890294] nova-conductor[52217]: ERROR nova.conductor.manager [ 761.890294] nova-conductor[52217]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 761.890294] nova-conductor[52217]: ERROR nova.conductor.manager [ 761.890294] nova-conductor[52217]: ERROR nova.conductor.manager [ 761.904018] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-7e00bdda-1f31-4e8a-ba4b-cf68b4dc8443 tempest-ServerPasswordTestJSON-1785275738 tempest-ServerPasswordTestJSON-1785275738-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 761.904018] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-7e00bdda-1f31-4e8a-ba4b-cf68b4dc8443 tempest-ServerPasswordTestJSON-1785275738 tempest-ServerPasswordTestJSON-1785275738-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 761.904018] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-7e00bdda-1f31-4e8a-ba4b-cf68b4dc8443 tempest-ServerPasswordTestJSON-1785275738 tempest-ServerPasswordTestJSON-1785275738-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 761.955660] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-7e00bdda-1f31-4e8a-ba4b-cf68b4dc8443 tempest-ServerPasswordTestJSON-1785275738 tempest-ServerPasswordTestJSON-1785275738-project-member] [instance: 07007c31-a03a-40a6-be0c-96def8c02f9c] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 761.956377] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-7e00bdda-1f31-4e8a-ba4b-cf68b4dc8443 tempest-ServerPasswordTestJSON-1785275738 tempest-ServerPasswordTestJSON-1785275738-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 761.956739] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-7e00bdda-1f31-4e8a-ba4b-cf68b4dc8443 tempest-ServerPasswordTestJSON-1785275738 tempest-ServerPasswordTestJSON-1785275738-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 761.956972] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-7e00bdda-1f31-4e8a-ba4b-cf68b4dc8443 tempest-ServerPasswordTestJSON-1785275738 tempest-ServerPasswordTestJSON-1785275738-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 761.960079] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-7e00bdda-1f31-4e8a-ba4b-cf68b4dc8443 tempest-ServerPasswordTestJSON-1785275738 tempest-ServerPasswordTestJSON-1785275738-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 761.960079] nova-conductor[52217]: Traceback (most recent call last): [ 761.960079] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 761.960079] nova-conductor[52217]: return func(*args, **kwargs) [ 761.960079] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 761.960079] nova-conductor[52217]: selections = self._select_destinations( [ 761.960079] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 761.960079] nova-conductor[52217]: selections = self._schedule( [ 761.960079] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 761.960079] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 761.960079] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 761.960079] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 761.960079] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 761.960079] nova-conductor[52217]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 761.960658] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-7e00bdda-1f31-4e8a-ba4b-cf68b4dc8443 tempest-ServerPasswordTestJSON-1785275738 tempest-ServerPasswordTestJSON-1785275738-project-member] [instance: 07007c31-a03a-40a6-be0c-96def8c02f9c] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 764.320381] nova-conductor[52216]: ERROR nova.conductor.manager [None req-9cf5e2e0-5a41-4473-9909-9dede173dbb0 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 764.320381] nova-conductor[52216]: Traceback (most recent call last): [ 764.320381] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 764.320381] nova-conductor[52216]: return func(*args, **kwargs) [ 764.320381] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 764.320381] nova-conductor[52216]: selections = self._select_destinations( [ 764.320381] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 764.320381] nova-conductor[52216]: selections = self._schedule( [ 764.320381] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 764.320381] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 764.320381] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 764.320381] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 764.320381] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 764.320381] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 764.320381] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 764.320381] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 764.320381] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 764.320381] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 764.320381] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 764.320381] nova-conductor[52216]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 764.320381] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 764.320381] nova-conductor[52216]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 764.320381] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 764.320381] nova-conductor[52216]: ERROR nova.conductor.manager result = self.transport._send( [ 764.320381] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 764.320381] nova-conductor[52216]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 764.320381] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 764.320381] nova-conductor[52216]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 764.320381] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 764.320381] nova-conductor[52216]: ERROR nova.conductor.manager raise result [ 764.320381] nova-conductor[52216]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 764.320381] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 764.320381] nova-conductor[52216]: ERROR nova.conductor.manager [ 764.320381] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 764.320381] nova-conductor[52216]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 764.320381] nova-conductor[52216]: ERROR nova.conductor.manager [ 764.320381] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 764.320381] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._select_destinations( [ 764.320381] nova-conductor[52216]: ERROR nova.conductor.manager [ 764.320381] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 764.320381] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._schedule( [ 764.320381] nova-conductor[52216]: ERROR nova.conductor.manager [ 764.320381] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 764.320381] nova-conductor[52216]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 764.320381] nova-conductor[52216]: ERROR nova.conductor.manager [ 764.320381] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 764.320381] nova-conductor[52216]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 764.320381] nova-conductor[52216]: ERROR nova.conductor.manager [ 764.320381] nova-conductor[52216]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 764.320381] nova-conductor[52216]: ERROR nova.conductor.manager [ 764.320381] nova-conductor[52216]: ERROR nova.conductor.manager [ 764.329785] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-9cf5e2e0-5a41-4473-9909-9dede173dbb0 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 764.330010] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-9cf5e2e0-5a41-4473-9909-9dede173dbb0 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 764.330595] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-9cf5e2e0-5a41-4473-9909-9dede173dbb0 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 764.374997] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-9cf5e2e0-5a41-4473-9909-9dede173dbb0 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: bdc44205-6c4b-4ffb-a04c-4e390e1820e1] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 764.375754] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-9cf5e2e0-5a41-4473-9909-9dede173dbb0 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 764.375983] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-9cf5e2e0-5a41-4473-9909-9dede173dbb0 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 764.376150] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-9cf5e2e0-5a41-4473-9909-9dede173dbb0 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s 
{{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 764.381736] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-9cf5e2e0-5a41-4473-9909-9dede173dbb0 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 764.381736] nova-conductor[52216]: Traceback (most recent call last): [ 764.381736] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 764.381736] nova-conductor[52216]: return func(*args, **kwargs) [ 764.381736] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 764.381736] nova-conductor[52216]: selections = self._select_destinations( [ 764.381736] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 764.381736] nova-conductor[52216]: selections = self._schedule( [ 764.381736] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 764.381736] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 764.381736] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 764.381736] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 764.381736] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 764.381736] nova-conductor[52216]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 764.382302] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-9cf5e2e0-5a41-4473-9909-9dede173dbb0 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: bdc44205-6c4b-4ffb-a04c-4e390e1820e1] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 766.705317] nova-conductor[52217]: ERROR nova.conductor.manager [None req-46848e26-6475-4729-aeb8-d20b55cdd8b9 tempest-ServersTestManualDisk-544945067 tempest-ServersTestManualDisk-544945067-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 766.705317] nova-conductor[52217]: Traceback (most recent call last): [ 766.705317] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 766.705317] nova-conductor[52217]: return func(*args, **kwargs) [ 766.705317] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 766.705317] nova-conductor[52217]: selections = self._select_destinations( [ 766.705317] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 766.705317] nova-conductor[52217]: selections = self._schedule( [ 766.705317] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 766.705317] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 766.705317] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 766.705317] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 766.705317] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 766.705317] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 766.705317] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 766.705317] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 766.705317] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 766.705317] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 766.705317] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 766.705317] nova-conductor[52217]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 766.705317] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 766.705317] nova-conductor[52217]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 766.705317] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 766.705317] nova-conductor[52217]: ERROR nova.conductor.manager result = self.transport._send( [ 766.705317] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 766.705317] nova-conductor[52217]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 766.705317] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 766.705317] nova-conductor[52217]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 766.705317] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 766.705317] nova-conductor[52217]: ERROR nova.conductor.manager raise result [ 766.705317] nova-conductor[52217]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 766.705317] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 766.705317] nova-conductor[52217]: ERROR nova.conductor.manager [ 766.705317] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 766.705317] nova-conductor[52217]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 766.705317] nova-conductor[52217]: ERROR nova.conductor.manager [ 766.705317] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 766.705317] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._select_destinations( [ 766.705317] nova-conductor[52217]: ERROR nova.conductor.manager [ 766.705317] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 766.705317] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._schedule( [ 766.705317] nova-conductor[52217]: ERROR nova.conductor.manager [ 766.705317] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 766.705317] nova-conductor[52217]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 766.705317] nova-conductor[52217]: ERROR nova.conductor.manager [ 766.705317] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 766.705317] nova-conductor[52217]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 766.705317] nova-conductor[52217]: ERROR nova.conductor.manager [ 766.705317] nova-conductor[52217]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 766.705317] nova-conductor[52217]: ERROR nova.conductor.manager [ 766.705317] nova-conductor[52217]: ERROR nova.conductor.manager [ 766.716502] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-46848e26-6475-4729-aeb8-d20b55cdd8b9 tempest-ServersTestManualDisk-544945067 tempest-ServersTestManualDisk-544945067-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 766.716578] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-46848e26-6475-4729-aeb8-d20b55cdd8b9 tempest-ServersTestManualDisk-544945067 tempest-ServersTestManualDisk-544945067-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 766.717196] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-46848e26-6475-4729-aeb8-d20b55cdd8b9 tempest-ServersTestManualDisk-544945067 tempest-ServersTestManualDisk-544945067-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 766.784046] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-46848e26-6475-4729-aeb8-d20b55cdd8b9 tempest-ServersTestManualDisk-544945067 tempest-ServersTestManualDisk-544945067-project-member] [instance: debf9b56-1b6d-458c-8512-18c8c443df4a] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 766.784775] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-46848e26-6475-4729-aeb8-d20b55cdd8b9 tempest-ServersTestManualDisk-544945067 tempest-ServersTestManualDisk-544945067-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 766.784978] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-46848e26-6475-4729-aeb8-d20b55cdd8b9 tempest-ServersTestManualDisk-544945067 tempest-ServersTestManualDisk-544945067-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 766.785171] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-46848e26-6475-4729-aeb8-d20b55cdd8b9 tempest-ServersTestManualDisk-544945067 tempest-ServersTestManualDisk-544945067-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 766.788495] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-46848e26-6475-4729-aeb8-d20b55cdd8b9 tempest-ServersTestManualDisk-544945067 tempest-ServersTestManualDisk-544945067-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 766.788495] nova-conductor[52217]: Traceback (most recent call last): [ 766.788495] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 766.788495] nova-conductor[52217]: return func(*args, **kwargs) [ 766.788495] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 766.788495] nova-conductor[52217]: selections = self._select_destinations( [ 766.788495] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 766.788495] nova-conductor[52217]: selections = self._schedule( [ 766.788495] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 766.788495] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 766.788495] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 766.788495] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 766.788495] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 766.788495] nova-conductor[52217]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 766.789029] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-46848e26-6475-4729-aeb8-d20b55cdd8b9 tempest-ServersTestManualDisk-544945067 tempest-ServersTestManualDisk-544945067-project-member] [instance: debf9b56-1b6d-458c-8512-18c8c443df4a] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 767.591867] nova-conductor[52216]: ERROR nova.conductor.manager [None req-11ea7040-2afa-48f4-abac-1aebf018bf0b tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 767.591867] nova-conductor[52216]: Traceback (most recent call last): [ 767.591867] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 767.591867] nova-conductor[52216]: return func(*args, **kwargs) [ 767.591867] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 767.591867] nova-conductor[52216]: selections = self._select_destinations( [ 767.591867] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 767.591867] nova-conductor[52216]: selections = self._schedule( [ 767.591867] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 767.591867] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 767.591867] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 767.591867] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 767.591867] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 767.591867] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 767.591867] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 767.591867] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 767.591867] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 767.591867] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 767.591867] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 767.591867] nova-conductor[52216]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 767.591867] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 767.591867] nova-conductor[52216]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 767.591867] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 767.591867] nova-conductor[52216]: ERROR nova.conductor.manager result = self.transport._send( [ 767.591867] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 767.591867] nova-conductor[52216]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 767.591867] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 767.591867] nova-conductor[52216]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 767.591867] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 767.591867] nova-conductor[52216]: ERROR nova.conductor.manager raise result [ 767.591867] nova-conductor[52216]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 767.591867] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 767.591867] nova-conductor[52216]: ERROR nova.conductor.manager [ 767.591867] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 767.591867] nova-conductor[52216]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 767.591867] nova-conductor[52216]: ERROR nova.conductor.manager [ 767.591867] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 767.591867] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._select_destinations( [ 767.591867] nova-conductor[52216]: ERROR nova.conductor.manager [ 767.591867] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 767.591867] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._schedule( [ 767.591867] nova-conductor[52216]: ERROR nova.conductor.manager [ 767.591867] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 767.591867] nova-conductor[52216]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 767.591867] nova-conductor[52216]: ERROR nova.conductor.manager [ 767.591867] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 767.591867] nova-conductor[52216]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 767.591867] nova-conductor[52216]: ERROR nova.conductor.manager [ 767.591867] nova-conductor[52216]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 767.591867] nova-conductor[52216]: ERROR nova.conductor.manager [ 767.591867] nova-conductor[52216]: ERROR nova.conductor.manager [ 767.600223] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-11ea7040-2afa-48f4-abac-1aebf018bf0b tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 767.600305] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-11ea7040-2afa-48f4-abac-1aebf018bf0b tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 767.600711] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-11ea7040-2afa-48f4-abac-1aebf018bf0b tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 767.664771] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-11ea7040-2afa-48f4-abac-1aebf018bf0b tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: 0e104a1c-61e3-46d2-9536-c8c5ab77e76a] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 767.665798] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-11ea7040-2afa-48f4-abac-1aebf018bf0b tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 767.665798] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-11ea7040-2afa-48f4-abac-1aebf018bf0b tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 767.665944] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-11ea7040-2afa-48f4-abac-1aebf018bf0b tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s 
{{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 767.670796] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-11ea7040-2afa-48f4-abac-1aebf018bf0b tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 767.670796] nova-conductor[52216]: Traceback (most recent call last): [ 767.670796] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 767.670796] nova-conductor[52216]: return func(*args, **kwargs) [ 767.670796] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 767.670796] nova-conductor[52216]: selections = self._select_destinations( [ 767.670796] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 767.670796] nova-conductor[52216]: selections = self._schedule( [ 767.670796] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 767.670796] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 767.670796] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 767.670796] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 767.670796] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 767.670796] nova-conductor[52216]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 767.671713] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-11ea7040-2afa-48f4-abac-1aebf018bf0b tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: 0e104a1c-61e3-46d2-9536-c8c5ab77e76a] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 769.591461] nova-conductor[52217]: ERROR nova.conductor.manager [None req-dad09080-1446-4ae1-b662-c95efbea1a06 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 769.591461] nova-conductor[52217]: Traceback (most recent call last): [ 769.591461] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 769.591461] nova-conductor[52217]: return func(*args, **kwargs) [ 769.591461] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 769.591461] nova-conductor[52217]: selections = self._select_destinations( [ 769.591461] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 769.591461] nova-conductor[52217]: selections = self._schedule( [ 769.591461] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 769.591461] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 769.591461] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 769.591461] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 769.591461] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. 
There are not enough hosts available. [ 769.591461] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 769.591461] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 769.591461] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 769.591461] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 769.591461] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 769.591461] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 769.591461] nova-conductor[52217]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 769.591461] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 769.591461] nova-conductor[52217]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 769.591461] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 769.591461] nova-conductor[52217]: ERROR nova.conductor.manager result = self.transport._send( [ 769.591461] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 769.591461] nova-conductor[52217]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 769.591461] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 769.591461] nova-conductor[52217]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 769.591461] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 769.591461] nova-conductor[52217]: ERROR nova.conductor.manager raise result [ 769.591461] nova-conductor[52217]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 769.591461] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 769.591461] nova-conductor[52217]: ERROR nova.conductor.manager [ 769.591461] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 769.591461] nova-conductor[52217]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 769.591461] nova-conductor[52217]: ERROR nova.conductor.manager [ 769.591461] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 769.591461] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._select_destinations( [ 769.591461] nova-conductor[52217]: ERROR nova.conductor.manager [ 769.591461] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 769.591461] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._schedule( [ 769.591461] nova-conductor[52217]: ERROR nova.conductor.manager [ 769.591461] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 769.591461] nova-conductor[52217]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 769.591461] nova-conductor[52217]: ERROR nova.conductor.manager [ 769.591461] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 769.591461] nova-conductor[52217]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 769.591461] nova-conductor[52217]: ERROR nova.conductor.manager [ 769.591461] nova-conductor[52217]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 769.591461] nova-conductor[52217]: ERROR nova.conductor.manager [ 769.591461] nova-conductor[52217]: ERROR nova.conductor.manager [ 769.602042] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-dad09080-1446-4ae1-b662-c95efbea1a06 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 769.602271] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-dad09080-1446-4ae1-b662-c95efbea1a06 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 769.602417] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-dad09080-1446-4ae1-b662-c95efbea1a06 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 769.646790] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-dad09080-1446-4ae1-b662-c95efbea1a06 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: f8d0b9ac-5319-462f-81c2-9202d61118b6] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 769.646790] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-dad09080-1446-4ae1-b662-c95efbea1a06 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 769.647078] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-dad09080-1446-4ae1-b662-c95efbea1a06 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 769.647129] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-dad09080-1446-4ae1-b662-c95efbea1a06 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s 
{{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 769.652675] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-dad09080-1446-4ae1-b662-c95efbea1a06 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 769.652675] nova-conductor[52217]: Traceback (most recent call last): [ 769.652675] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 769.652675] nova-conductor[52217]: return func(*args, **kwargs) [ 769.652675] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 769.652675] nova-conductor[52217]: selections = self._select_destinations( [ 769.652675] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 769.652675] nova-conductor[52217]: selections = self._schedule( [ 769.652675] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 769.652675] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 769.652675] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 769.652675] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 769.652675] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 769.652675] nova-conductor[52217]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 769.653330] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-dad09080-1446-4ae1-b662-c95efbea1a06 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: f8d0b9ac-5319-462f-81c2-9202d61118b6] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 773.156652] nova-conductor[52216]: ERROR nova.conductor.manager [None req-9a53381a-3066-46c7-b781-95d607d2087f tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 773.156652] nova-conductor[52216]: Traceback (most recent call last): [ 773.156652] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 773.156652] nova-conductor[52216]: return func(*args, **kwargs) [ 773.156652] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 773.156652] nova-conductor[52216]: selections = self._select_destinations( [ 773.156652] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 773.156652] nova-conductor[52216]: selections = self._schedule( [ 773.156652] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 773.156652] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 773.156652] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 773.156652] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 773.156652] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. 
There are not enough hosts available. [ 773.156652] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 773.156652] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 773.156652] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 773.156652] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 773.156652] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 773.156652] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 773.156652] nova-conductor[52216]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 773.156652] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 773.156652] nova-conductor[52216]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 773.156652] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 773.156652] nova-conductor[52216]: ERROR nova.conductor.manager result = self.transport._send( [ 773.156652] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 773.156652] nova-conductor[52216]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 773.156652] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 773.156652] nova-conductor[52216]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 773.156652] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 773.156652] nova-conductor[52216]: ERROR nova.conductor.manager raise result [ 773.156652] nova-conductor[52216]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 773.156652] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 773.156652] nova-conductor[52216]: ERROR nova.conductor.manager [ 773.156652] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 773.156652] nova-conductor[52216]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 773.156652] nova-conductor[52216]: ERROR nova.conductor.manager [ 773.156652] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 773.156652] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._select_destinations( [ 773.156652] nova-conductor[52216]: ERROR nova.conductor.manager [ 773.156652] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 773.156652] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._schedule( [ 773.156652] nova-conductor[52216]: ERROR nova.conductor.manager [ 773.156652] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 773.156652] nova-conductor[52216]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 773.156652] nova-conductor[52216]: ERROR nova.conductor.manager [ 773.156652] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 773.156652] nova-conductor[52216]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 773.156652] nova-conductor[52216]: ERROR nova.conductor.manager [ 773.156652] nova-conductor[52216]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 773.156652] nova-conductor[52216]: ERROR nova.conductor.manager [ 773.156652] nova-conductor[52216]: ERROR nova.conductor.manager [ 773.166038] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-9a53381a-3066-46c7-b781-95d607d2087f tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 773.167030] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-9a53381a-3066-46c7-b781-95d607d2087f tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 773.167030] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-9a53381a-3066-46c7-b781-95d607d2087f tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 773.243608] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-9a53381a-3066-46c7-b781-95d607d2087f tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: f42cee0a-1292-4eb5-922f-e0207689ab92] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 773.244399] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-9a53381a-3066-46c7-b781-95d607d2087f tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 773.244604] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-9a53381a-3066-46c7-b781-95d607d2087f tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 773.244768] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-9a53381a-3066-46c7-b781-95d607d2087f tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s 
{{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 773.261572] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-9a53381a-3066-46c7-b781-95d607d2087f tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 773.261572] nova-conductor[52216]: Traceback (most recent call last): [ 773.261572] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 773.261572] nova-conductor[52216]: return func(*args, **kwargs) [ 773.261572] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 773.261572] nova-conductor[52216]: selections = self._select_destinations( [ 773.261572] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 773.261572] nova-conductor[52216]: selections = self._schedule( [ 773.261572] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 773.261572] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 773.261572] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 773.261572] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 773.261572] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 773.261572] nova-conductor[52216]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 773.266268] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-9a53381a-3066-46c7-b781-95d607d2087f tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: f42cee0a-1292-4eb5-922f-e0207689ab92] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 773.279273] nova-conductor[52217]: ERROR nova.conductor.manager [None req-24204e79-1c44-474d-91ba-3340bacdd8f3 tempest-ServerShowV247Test-1286037102 tempest-ServerShowV247Test-1286037102-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 773.279273] nova-conductor[52217]: Traceback (most recent call last): [ 773.279273] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 773.279273] nova-conductor[52217]: return func(*args, **kwargs) [ 773.279273] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 773.279273] nova-conductor[52217]: selections = self._select_destinations( [ 773.279273] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 773.279273] nova-conductor[52217]: selections = self._schedule( [ 773.279273] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 773.279273] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 773.279273] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 773.279273] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 773.279273] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 773.279273] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 773.279273] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 773.279273] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 773.279273] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 773.279273] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 773.279273] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 773.279273] nova-conductor[52217]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 773.279273] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 773.279273] nova-conductor[52217]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 773.279273] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 773.279273] nova-conductor[52217]: ERROR nova.conductor.manager result = self.transport._send( [ 773.279273] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 773.279273] nova-conductor[52217]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 773.279273] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 773.279273] nova-conductor[52217]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 773.279273] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 773.279273] nova-conductor[52217]: ERROR nova.conductor.manager raise result [ 773.279273] nova-conductor[52217]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 773.279273] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 773.279273] nova-conductor[52217]: ERROR nova.conductor.manager [ 773.279273] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 773.279273] nova-conductor[52217]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 773.279273] nova-conductor[52217]: ERROR nova.conductor.manager [ 773.279273] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 773.279273] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._select_destinations( [ 773.279273] nova-conductor[52217]: ERROR nova.conductor.manager [ 773.279273] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 773.279273] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._schedule( [ 773.279273] nova-conductor[52217]: ERROR nova.conductor.manager [ 773.279273] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 773.279273] nova-conductor[52217]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 773.279273] nova-conductor[52217]: ERROR nova.conductor.manager [ 773.279273] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 773.279273] nova-conductor[52217]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 773.279273] nova-conductor[52217]: ERROR nova.conductor.manager [ 773.279273] nova-conductor[52217]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 773.279273] nova-conductor[52217]: ERROR nova.conductor.manager [ 773.279273] nova-conductor[52217]: ERROR nova.conductor.manager [ 773.297842] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-24204e79-1c44-474d-91ba-3340bacdd8f3 tempest-ServerShowV247Test-1286037102 tempest-ServerShowV247Test-1286037102-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 773.297842] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-24204e79-1c44-474d-91ba-3340bacdd8f3 tempest-ServerShowV247Test-1286037102 tempest-ServerShowV247Test-1286037102-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 773.298874] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-24204e79-1c44-474d-91ba-3340bacdd8f3 tempest-ServerShowV247Test-1286037102 tempest-ServerShowV247Test-1286037102-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 773.384213] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-24204e79-1c44-474d-91ba-3340bacdd8f3 tempest-ServerShowV247Test-1286037102 tempest-ServerShowV247Test-1286037102-project-member] [instance: 7ad816cb-a337-4878-8825-d2625e462f85] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 773.385168] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-24204e79-1c44-474d-91ba-3340bacdd8f3 tempest-ServerShowV247Test-1286037102 tempest-ServerShowV247Test-1286037102-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 773.385391] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-24204e79-1c44-474d-91ba-3340bacdd8f3 tempest-ServerShowV247Test-1286037102 tempest-ServerShowV247Test-1286037102-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 773.385573] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-24204e79-1c44-474d-91ba-3340bacdd8f3 tempest-ServerShowV247Test-1286037102 tempest-ServerShowV247Test-1286037102-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 773.389137] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-24204e79-1c44-474d-91ba-3340bacdd8f3 tempest-ServerShowV247Test-1286037102 tempest-ServerShowV247Test-1286037102-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 773.389137] nova-conductor[52217]: Traceback (most recent call last): [ 773.389137] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 773.389137] nova-conductor[52217]: return func(*args, **kwargs) [ 773.389137] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 773.389137] nova-conductor[52217]: selections = self._select_destinations( [ 773.389137] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 773.389137] nova-conductor[52217]: selections = self._schedule( [ 773.389137] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 773.389137] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 773.389137] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 773.389137] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 773.389137] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 773.389137] nova-conductor[52217]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 773.389654] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-24204e79-1c44-474d-91ba-3340bacdd8f3 tempest-ServerShowV247Test-1286037102 tempest-ServerShowV247Test-1286037102-project-member] [instance: 7ad816cb-a337-4878-8825-d2625e462f85] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 773.712632] nova-conductor[52216]: ERROR nova.conductor.manager [None req-46372a5e-337e-429d-aa6b-897bd85c6a0c tempest-ServerShowV247Test-1286037102 tempest-ServerShowV247Test-1286037102-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 773.712632] nova-conductor[52216]: Traceback (most recent call last): [ 773.712632] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 773.712632] nova-conductor[52216]: return func(*args, **kwargs) [ 773.712632] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 773.712632] nova-conductor[52216]: selections = self._select_destinations( [ 773.712632] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 773.712632] nova-conductor[52216]: selections = self._schedule( [ 773.712632] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 773.712632] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 773.712632] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 773.712632] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 773.712632] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 773.712632] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 773.712632] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 773.712632] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 773.712632] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 773.712632] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 773.712632] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 773.712632] nova-conductor[52216]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 773.712632] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 773.712632] nova-conductor[52216]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 773.712632] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 773.712632] nova-conductor[52216]: ERROR nova.conductor.manager result = self.transport._send( [ 773.712632] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 773.712632] nova-conductor[52216]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 773.712632] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 773.712632] nova-conductor[52216]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 773.712632] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 773.712632] nova-conductor[52216]: ERROR nova.conductor.manager raise result [ 773.712632] nova-conductor[52216]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 773.712632] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 773.712632] nova-conductor[52216]: ERROR nova.conductor.manager [ 773.712632] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 773.712632] nova-conductor[52216]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 773.712632] nova-conductor[52216]: ERROR nova.conductor.manager [ 773.712632] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 773.712632] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._select_destinations( [ 773.712632] nova-conductor[52216]: ERROR nova.conductor.manager [ 773.712632] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 773.712632] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._schedule( [ 773.712632] nova-conductor[52216]: ERROR nova.conductor.manager [ 773.712632] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 773.712632] nova-conductor[52216]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 773.712632] nova-conductor[52216]: ERROR nova.conductor.manager [ 773.712632] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 773.712632] nova-conductor[52216]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 773.712632] nova-conductor[52216]: ERROR nova.conductor.manager [ 773.712632] nova-conductor[52216]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 773.712632] nova-conductor[52216]: ERROR nova.conductor.manager [ 773.712632] nova-conductor[52216]: ERROR nova.conductor.manager [ 773.720763] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-46372a5e-337e-429d-aa6b-897bd85c6a0c tempest-ServerShowV247Test-1286037102 tempest-ServerShowV247Test-1286037102-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 773.721447] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-46372a5e-337e-429d-aa6b-897bd85c6a0c tempest-ServerShowV247Test-1286037102 tempest-ServerShowV247Test-1286037102-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 773.721447] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-46372a5e-337e-429d-aa6b-897bd85c6a0c tempest-ServerShowV247Test-1286037102 tempest-ServerShowV247Test-1286037102-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 773.775271] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-46372a5e-337e-429d-aa6b-897bd85c6a0c tempest-ServerShowV247Test-1286037102 tempest-ServerShowV247Test-1286037102-project-member] [instance: 9912b6f1-c2ae-4d75-a6ac-fd9105ab29f2] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 773.775989] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-46372a5e-337e-429d-aa6b-897bd85c6a0c tempest-ServerShowV247Test-1286037102 tempest-ServerShowV247Test-1286037102-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 773.776221] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-46372a5e-337e-429d-aa6b-897bd85c6a0c tempest-ServerShowV247Test-1286037102 tempest-ServerShowV247Test-1286037102-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 773.776374] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-46372a5e-337e-429d-aa6b-897bd85c6a0c tempest-ServerShowV247Test-1286037102 tempest-ServerShowV247Test-1286037102-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 773.781257] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-46372a5e-337e-429d-aa6b-897bd85c6a0c tempest-ServerShowV247Test-1286037102 tempest-ServerShowV247Test-1286037102-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 773.781257] nova-conductor[52216]: Traceback (most recent call last): [ 773.781257] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 773.781257] nova-conductor[52216]: return func(*args, **kwargs) [ 773.781257] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 773.781257] nova-conductor[52216]: selections = self._select_destinations( [ 773.781257] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 773.781257] nova-conductor[52216]: selections = self._schedule( [ 773.781257] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 773.781257] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 773.781257] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 773.781257] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 773.781257] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 773.781257] nova-conductor[52216]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 773.781916] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-46372a5e-337e-429d-aa6b-897bd85c6a0c tempest-ServerShowV247Test-1286037102 tempest-ServerShowV247Test-1286037102-project-member] [instance: 9912b6f1-c2ae-4d75-a6ac-fd9105ab29f2] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 776.220123] nova-conductor[52217]: ERROR nova.conductor.manager [None req-9a0956fc-17f7-401f-b812-ecf2a880450e tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 776.220123] nova-conductor[52217]: Traceback (most recent call last): [ 776.220123] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 776.220123] nova-conductor[52217]: return func(*args, **kwargs) [ 776.220123] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 776.220123] nova-conductor[52217]: selections = self._select_destinations( [ 776.220123] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 776.220123] nova-conductor[52217]: selections = self._schedule( [ 776.220123] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 776.220123] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 776.220123] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 776.220123] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 776.220123] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 776.220123] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 776.220123] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 776.220123] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 776.220123] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 776.220123] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 776.220123] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 776.220123] nova-conductor[52217]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 776.220123] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 776.220123] nova-conductor[52217]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 776.220123] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 776.220123] nova-conductor[52217]: ERROR nova.conductor.manager result = self.transport._send( [ 776.220123] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 776.220123] nova-conductor[52217]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 776.220123] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 776.220123] nova-conductor[52217]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 776.220123] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 776.220123] nova-conductor[52217]: ERROR nova.conductor.manager raise result [ 776.220123] nova-conductor[52217]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 776.220123] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 776.220123] nova-conductor[52217]: ERROR nova.conductor.manager [ 776.220123] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 776.220123] nova-conductor[52217]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 776.220123] nova-conductor[52217]: ERROR nova.conductor.manager [ 776.220123] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 776.220123] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._select_destinations( [ 776.220123] nova-conductor[52217]: ERROR nova.conductor.manager [ 776.220123] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 776.220123] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._schedule( [ 776.220123] nova-conductor[52217]: ERROR nova.conductor.manager [ 776.220123] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 776.220123] nova-conductor[52217]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 776.220123] nova-conductor[52217]: ERROR nova.conductor.manager [ 776.220123] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 776.220123] nova-conductor[52217]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 776.220123] nova-conductor[52217]: ERROR nova.conductor.manager [ 776.220123] nova-conductor[52217]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 776.220123] nova-conductor[52217]: ERROR nova.conductor.manager [ 776.220123] nova-conductor[52217]: ERROR nova.conductor.manager [ 776.230217] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-9a0956fc-17f7-401f-b812-ecf2a880450e tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 776.230217] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-9a0956fc-17f7-401f-b812-ecf2a880450e tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 776.230217] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-9a0956fc-17f7-401f-b812-ecf2a880450e tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 776.279315] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-9a0956fc-17f7-401f-b812-ecf2a880450e tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 704626ff-3993-4b82-b7d5-d908d55c9a4c] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 776.280110] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-9a0956fc-17f7-401f-b812-ecf2a880450e tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 776.280374] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-9a0956fc-17f7-401f-b812-ecf2a880450e tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 776.280485] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-9a0956fc-17f7-401f-b812-ecf2a880450e tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s 
{{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 776.284177] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-9a0956fc-17f7-401f-b812-ecf2a880450e tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 776.284177] nova-conductor[52217]: Traceback (most recent call last): [ 776.284177] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 776.284177] nova-conductor[52217]: return func(*args, **kwargs) [ 776.284177] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 776.284177] nova-conductor[52217]: selections = self._select_destinations( [ 776.284177] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 776.284177] nova-conductor[52217]: selections = self._schedule( [ 776.284177] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 776.284177] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 776.284177] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 776.284177] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 776.284177] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 776.284177] nova-conductor[52217]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 776.284616] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-9a0956fc-17f7-401f-b812-ecf2a880450e tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 704626ff-3993-4b82-b7d5-d908d55c9a4c] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 776.892229] nova-conductor[52216]: ERROR nova.conductor.manager [None req-96b70f00-3a66-4220-82b8-4756cc715e68 tempest-AttachVolumeNegativeTest-480777919 tempest-AttachVolumeNegativeTest-480777919-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 776.892229] nova-conductor[52216]: Traceback (most recent call last): [ 776.892229] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 776.892229] nova-conductor[52216]: return func(*args, **kwargs) [ 776.892229] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 776.892229] nova-conductor[52216]: selections = self._select_destinations( [ 776.892229] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 776.892229] nova-conductor[52216]: selections = self._schedule( [ 776.892229] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 776.892229] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 776.892229] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 776.892229] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 776.892229] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 776.892229] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 776.892229] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 776.892229] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 776.892229] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 776.892229] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 776.892229] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 776.892229] nova-conductor[52216]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 776.892229] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 776.892229] nova-conductor[52216]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 776.892229] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 776.892229] nova-conductor[52216]: ERROR nova.conductor.manager result = self.transport._send( [ 776.892229] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 776.892229] nova-conductor[52216]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 776.892229] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 776.892229] nova-conductor[52216]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 776.892229] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 776.892229] nova-conductor[52216]: ERROR nova.conductor.manager raise result [ 776.892229] nova-conductor[52216]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 776.892229] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 776.892229] nova-conductor[52216]: ERROR nova.conductor.manager [ 776.892229] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 776.892229] nova-conductor[52216]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 776.892229] nova-conductor[52216]: ERROR nova.conductor.manager [ 776.892229] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 776.892229] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._select_destinations( [ 776.892229] nova-conductor[52216]: ERROR nova.conductor.manager [ 776.892229] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 776.892229] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._schedule( [ 776.892229] nova-conductor[52216]: ERROR nova.conductor.manager [ 776.892229] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 776.892229] nova-conductor[52216]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 776.892229] nova-conductor[52216]: ERROR nova.conductor.manager [ 776.892229] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 776.892229] nova-conductor[52216]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 776.892229] nova-conductor[52216]: ERROR nova.conductor.manager [ 776.892229] nova-conductor[52216]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 776.892229] nova-conductor[52216]: ERROR nova.conductor.manager [ 776.892229] nova-conductor[52216]: ERROR nova.conductor.manager [ 776.903171] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-96b70f00-3a66-4220-82b8-4756cc715e68 tempest-AttachVolumeNegativeTest-480777919 tempest-AttachVolumeNegativeTest-480777919-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 776.903573] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-96b70f00-3a66-4220-82b8-4756cc715e68 tempest-AttachVolumeNegativeTest-480777919 tempest-AttachVolumeNegativeTest-480777919-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 776.903907] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-96b70f00-3a66-4220-82b8-4756cc715e68 tempest-AttachVolumeNegativeTest-480777919 tempest-AttachVolumeNegativeTest-480777919-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 776.961282] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-96b70f00-3a66-4220-82b8-4756cc715e68 tempest-AttachVolumeNegativeTest-480777919 tempest-AttachVolumeNegativeTest-480777919-project-member] [instance: 25de2d67-d314-48d3-ba89-0fd38b9761fa] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 776.962791] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-96b70f00-3a66-4220-82b8-4756cc715e68 tempest-AttachVolumeNegativeTest-480777919 tempest-AttachVolumeNegativeTest-480777919-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 776.962791] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-96b70f00-3a66-4220-82b8-4756cc715e68 tempest-AttachVolumeNegativeTest-480777919 tempest-AttachVolumeNegativeTest-480777919-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 776.962791] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-96b70f00-3a66-4220-82b8-4756cc715e68 tempest-AttachVolumeNegativeTest-480777919 tempest-AttachVolumeNegativeTest-480777919-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 776.965896] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-96b70f00-3a66-4220-82b8-4756cc715e68 tempest-AttachVolumeNegativeTest-480777919 tempest-AttachVolumeNegativeTest-480777919-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 776.965896] nova-conductor[52216]: Traceback (most recent call last): [ 776.965896] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 776.965896] nova-conductor[52216]: return func(*args, **kwargs) [ 776.965896] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 776.965896] nova-conductor[52216]: selections = self._select_destinations( [ 776.965896] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 776.965896] nova-conductor[52216]: selections = self._schedule( [ 776.965896] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 776.965896] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 776.965896] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 776.965896] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 776.965896] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 776.965896] nova-conductor[52216]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 776.966563] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-96b70f00-3a66-4220-82b8-4756cc715e68 tempest-AttachVolumeNegativeTest-480777919 tempest-AttachVolumeNegativeTest-480777919-project-member] [instance: 25de2d67-d314-48d3-ba89-0fd38b9761fa] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 777.161684] nova-conductor[52217]: ERROR nova.conductor.manager [None req-c4af12f0-b848-48b1-b37a-61da8b27450e tempest-InstanceActionsTestJSON-543652899 tempest-InstanceActionsTestJSON-543652899-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 777.161684] nova-conductor[52217]: Traceback (most recent call last): [ 777.161684] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 777.161684] nova-conductor[52217]: return func(*args, **kwargs) [ 777.161684] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 777.161684] nova-conductor[52217]: selections = self._select_destinations( [ 777.161684] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 777.161684] nova-conductor[52217]: selections = self._schedule( [ 777.161684] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 777.161684] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 777.161684] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 777.161684] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 777.161684] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 777.161684] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 777.161684] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 777.161684] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 777.161684] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 777.161684] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 777.161684] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 777.161684] nova-conductor[52217]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 777.161684] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 777.161684] nova-conductor[52217]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 777.161684] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 777.161684] nova-conductor[52217]: ERROR nova.conductor.manager result = self.transport._send( [ 777.161684] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 777.161684] nova-conductor[52217]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 777.161684] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 777.161684] nova-conductor[52217]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 777.161684] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 777.161684] nova-conductor[52217]: ERROR nova.conductor.manager raise result [ 777.161684] nova-conductor[52217]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 777.161684] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 777.161684] nova-conductor[52217]: ERROR nova.conductor.manager [ 777.161684] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 777.161684] nova-conductor[52217]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 777.161684] nova-conductor[52217]: ERROR nova.conductor.manager [ 777.161684] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 777.161684] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._select_destinations( [ 777.161684] nova-conductor[52217]: ERROR nova.conductor.manager [ 777.161684] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 777.161684] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._schedule( [ 777.161684] nova-conductor[52217]: ERROR nova.conductor.manager [ 777.161684] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 777.161684] nova-conductor[52217]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 777.161684] nova-conductor[52217]: ERROR nova.conductor.manager [ 777.161684] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 777.161684] nova-conductor[52217]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 777.161684] nova-conductor[52217]: ERROR nova.conductor.manager [ 777.161684] nova-conductor[52217]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 777.161684] nova-conductor[52217]: ERROR nova.conductor.manager [ 777.161684] nova-conductor[52217]: ERROR nova.conductor.manager [ 777.169051] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c4af12f0-b848-48b1-b37a-61da8b27450e tempest-InstanceActionsTestJSON-543652899 tempest-InstanceActionsTestJSON-543652899-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 777.169299] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c4af12f0-b848-48b1-b37a-61da8b27450e tempest-InstanceActionsTestJSON-543652899 tempest-InstanceActionsTestJSON-543652899-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 777.169471] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c4af12f0-b848-48b1-b37a-61da8b27450e tempest-InstanceActionsTestJSON-543652899 tempest-InstanceActionsTestJSON-543652899-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 777.231936] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-c4af12f0-b848-48b1-b37a-61da8b27450e tempest-InstanceActionsTestJSON-543652899 tempest-InstanceActionsTestJSON-543652899-project-member] [instance: 3a64b78c-6913-44ae-b4c7-e07cdb4d58b6] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 777.232696] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c4af12f0-b848-48b1-b37a-61da8b27450e tempest-InstanceActionsTestJSON-543652899 tempest-InstanceActionsTestJSON-543652899-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 777.232909] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c4af12f0-b848-48b1-b37a-61da8b27450e tempest-InstanceActionsTestJSON-543652899 tempest-InstanceActionsTestJSON-543652899-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 777.233141] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c4af12f0-b848-48b1-b37a-61da8b27450e tempest-InstanceActionsTestJSON-543652899 tempest-InstanceActionsTestJSON-543652899-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 777.236438] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-c4af12f0-b848-48b1-b37a-61da8b27450e tempest-InstanceActionsTestJSON-543652899 tempest-InstanceActionsTestJSON-543652899-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 777.236438] nova-conductor[52217]: Traceback (most recent call last): [ 777.236438] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 777.236438] nova-conductor[52217]: return func(*args, **kwargs) [ 777.236438] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 777.236438] nova-conductor[52217]: selections = self._select_destinations( [ 777.236438] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 777.236438] nova-conductor[52217]: selections = self._schedule( [ 777.236438] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 777.236438] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 777.236438] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 777.236438] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 777.236438] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 777.236438] nova-conductor[52217]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 777.237103] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-c4af12f0-b848-48b1-b37a-61da8b27450e tempest-InstanceActionsTestJSON-543652899 tempest-InstanceActionsTestJSON-543652899-project-member] [instance: 3a64b78c-6913-44ae-b4c7-e07cdb4d58b6] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 777.845867] nova-conductor[52216]: ERROR nova.conductor.manager [None req-a908d7ce-929d-4a79-98fc-e1ace4a3cf99 tempest-ServerShowV254Test-1821947295 tempest-ServerShowV254Test-1821947295-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 777.845867] nova-conductor[52216]: Traceback (most recent call last): [ 777.845867] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 777.845867] nova-conductor[52216]: return func(*args, **kwargs) [ 777.845867] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 777.845867] nova-conductor[52216]: selections = self._select_destinations( [ 777.845867] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 777.845867] nova-conductor[52216]: selections = self._schedule( [ 777.845867] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 777.845867] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 777.845867] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 777.845867] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 777.845867] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 777.845867] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 777.845867] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 777.845867] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 777.845867] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 777.845867] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 777.845867] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 777.845867] nova-conductor[52216]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 777.845867] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 777.845867] nova-conductor[52216]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 777.845867] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 777.845867] nova-conductor[52216]: ERROR nova.conductor.manager result = self.transport._send( [ 777.845867] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 777.845867] nova-conductor[52216]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 777.845867] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 777.845867] nova-conductor[52216]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 777.845867] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 777.845867] nova-conductor[52216]: ERROR nova.conductor.manager raise result [ 777.845867] nova-conductor[52216]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 777.845867] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 777.845867] nova-conductor[52216]: ERROR nova.conductor.manager [ 777.845867] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 777.845867] nova-conductor[52216]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 777.845867] nova-conductor[52216]: ERROR nova.conductor.manager [ 777.845867] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 777.845867] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._select_destinations( [ 777.845867] nova-conductor[52216]: ERROR nova.conductor.manager [ 777.845867] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 777.845867] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._schedule( [ 777.845867] nova-conductor[52216]: ERROR nova.conductor.manager [ 777.845867] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 777.845867] nova-conductor[52216]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 777.845867] nova-conductor[52216]: ERROR nova.conductor.manager [ 777.845867] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 777.845867] nova-conductor[52216]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 777.845867] nova-conductor[52216]: ERROR nova.conductor.manager [ 777.845867] nova-conductor[52216]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 777.845867] nova-conductor[52216]: ERROR nova.conductor.manager [ 777.845867] nova-conductor[52216]: ERROR nova.conductor.manager [ 777.860599] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-a908d7ce-929d-4a79-98fc-e1ace4a3cf99 tempest-ServerShowV254Test-1821947295 tempest-ServerShowV254Test-1821947295-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 777.860599] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-a908d7ce-929d-4a79-98fc-e1ace4a3cf99 tempest-ServerShowV254Test-1821947295 tempest-ServerShowV254Test-1821947295-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 777.860599] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-a908d7ce-929d-4a79-98fc-e1ace4a3cf99 tempest-ServerShowV254Test-1821947295 tempest-ServerShowV254Test-1821947295-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 777.910265] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-a908d7ce-929d-4a79-98fc-e1ace4a3cf99 tempest-ServerShowV254Test-1821947295 tempest-ServerShowV254Test-1821947295-project-member] [instance: 4877bc24-7582-455f-bff3-6bb5e0bb41f0] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 777.910839] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-a908d7ce-929d-4a79-98fc-e1ace4a3cf99 tempest-ServerShowV254Test-1821947295 tempest-ServerShowV254Test-1821947295-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 777.911263] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-a908d7ce-929d-4a79-98fc-e1ace4a3cf99 tempest-ServerShowV254Test-1821947295 tempest-ServerShowV254Test-1821947295-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 777.911263] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-a908d7ce-929d-4a79-98fc-e1ace4a3cf99 tempest-ServerShowV254Test-1821947295 tempest-ServerShowV254Test-1821947295-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 777.914475] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-a908d7ce-929d-4a79-98fc-e1ace4a3cf99 tempest-ServerShowV254Test-1821947295 tempest-ServerShowV254Test-1821947295-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 777.914475] nova-conductor[52216]: Traceback (most recent call last): [ 777.914475] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 777.914475] nova-conductor[52216]: return func(*args, **kwargs) [ 777.914475] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 777.914475] nova-conductor[52216]: selections = self._select_destinations( [ 777.914475] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 777.914475] nova-conductor[52216]: selections = self._schedule( [ 777.914475] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 777.914475] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 777.914475] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 777.914475] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 777.914475] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 777.914475] nova-conductor[52216]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 777.915186] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-a908d7ce-929d-4a79-98fc-e1ace4a3cf99 tempest-ServerShowV254Test-1821947295 tempest-ServerShowV254Test-1821947295-project-member] [instance: 4877bc24-7582-455f-bff3-6bb5e0bb41f0] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 782.688271] nova-conductor[52217]: ERROR nova.conductor.manager [None req-82418922-2a19-40f8-8d29-f1cf4dce3de7 tempest-ServerAddressesNegativeTestJSON-495586381 tempest-ServerAddressesNegativeTestJSON-495586381-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 782.688271] nova-conductor[52217]: Traceback (most recent call last): [ 782.688271] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 782.688271] nova-conductor[52217]: return func(*args, **kwargs) [ 782.688271] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 782.688271] nova-conductor[52217]: selections = self._select_destinations( [ 782.688271] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 782.688271] nova-conductor[52217]: selections = self._schedule( [ 782.688271] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 782.688271] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 782.688271] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 782.688271] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 782.688271] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 782.688271] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 782.688271] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 782.688271] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 782.688271] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 782.688271] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 782.688271] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 782.688271] nova-conductor[52217]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 782.688271] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 782.688271] nova-conductor[52217]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 782.688271] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 782.688271] nova-conductor[52217]: ERROR nova.conductor.manager result = self.transport._send( [ 782.688271] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 782.688271] nova-conductor[52217]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 782.688271] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 782.688271] nova-conductor[52217]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 782.688271] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 782.688271] nova-conductor[52217]: ERROR nova.conductor.manager raise result [ 782.688271] nova-conductor[52217]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 782.688271] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 782.688271] nova-conductor[52217]: ERROR nova.conductor.manager [ 782.688271] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 782.688271] nova-conductor[52217]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 782.688271] nova-conductor[52217]: ERROR nova.conductor.manager [ 782.688271] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 782.688271] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._select_destinations( [ 782.688271] nova-conductor[52217]: ERROR nova.conductor.manager [ 782.688271] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 782.688271] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._schedule( [ 782.688271] nova-conductor[52217]: ERROR nova.conductor.manager [ 782.688271] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 782.688271] nova-conductor[52217]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 782.688271] nova-conductor[52217]: ERROR nova.conductor.manager [ 782.688271] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 782.688271] nova-conductor[52217]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 782.688271] nova-conductor[52217]: ERROR nova.conductor.manager [ 782.688271] nova-conductor[52217]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 782.688271] nova-conductor[52217]: ERROR nova.conductor.manager [ 782.688271] nova-conductor[52217]: ERROR nova.conductor.manager [ 782.701898] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-82418922-2a19-40f8-8d29-f1cf4dce3de7 tempest-ServerAddressesNegativeTestJSON-495586381 tempest-ServerAddressesNegativeTestJSON-495586381-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 782.701898] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-82418922-2a19-40f8-8d29-f1cf4dce3de7 tempest-ServerAddressesNegativeTestJSON-495586381 tempest-ServerAddressesNegativeTestJSON-495586381-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 782.701898] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-82418922-2a19-40f8-8d29-f1cf4dce3de7 tempest-ServerAddressesNegativeTestJSON-495586381 tempest-ServerAddressesNegativeTestJSON-495586381-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 782.755108] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-82418922-2a19-40f8-8d29-f1cf4dce3de7 tempest-ServerAddressesNegativeTestJSON-495586381 tempest-ServerAddressesNegativeTestJSON-495586381-project-member] [instance: 0d3acb87-b76e-49ba-b428-d668b4f832b0] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 782.755843] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-82418922-2a19-40f8-8d29-f1cf4dce3de7 tempest-ServerAddressesNegativeTestJSON-495586381 tempest-ServerAddressesNegativeTestJSON-495586381-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 782.756070] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-82418922-2a19-40f8-8d29-f1cf4dce3de7 tempest-ServerAddressesNegativeTestJSON-495586381 tempest-ServerAddressesNegativeTestJSON-495586381-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 782.756242] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-82418922-2a19-40f8-8d29-f1cf4dce3de7 
tempest-ServerAddressesNegativeTestJSON-495586381 tempest-ServerAddressesNegativeTestJSON-495586381-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 782.760873] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-82418922-2a19-40f8-8d29-f1cf4dce3de7 tempest-ServerAddressesNegativeTestJSON-495586381 tempest-ServerAddressesNegativeTestJSON-495586381-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 782.760873] nova-conductor[52217]: Traceback (most recent call last): [ 782.760873] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 782.760873] nova-conductor[52217]: return func(*args, **kwargs) [ 782.760873] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 782.760873] nova-conductor[52217]: selections = self._select_destinations( [ 782.760873] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 782.760873] nova-conductor[52217]: selections = self._schedule( [ 782.760873] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 782.760873] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 782.760873] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 782.760873] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 782.760873] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 782.760873] nova-conductor[52217]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 782.761826] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-82418922-2a19-40f8-8d29-f1cf4dce3de7 tempest-ServerAddressesNegativeTestJSON-495586381 tempest-ServerAddressesNegativeTestJSON-495586381-project-member] [instance: 0d3acb87-b76e-49ba-b428-d668b4f832b0] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 783.328240] nova-conductor[52216]: ERROR nova.conductor.manager [None req-4c4016e2-10d7-4b60-993e-c59d702f981e tempest-AttachVolumeNegativeTest-480777919 tempest-AttachVolumeNegativeTest-480777919-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 783.328240] nova-conductor[52216]: Traceback (most recent call last): [ 783.328240] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 783.328240] nova-conductor[52216]: return func(*args, **kwargs) [ 783.328240] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 783.328240] nova-conductor[52216]: selections = self._select_destinations( [ 783.328240] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 783.328240] nova-conductor[52216]: selections = self._schedule( [ 783.328240] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 783.328240] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 783.328240] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 783.328240] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 783.328240] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 783.328240] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 783.328240] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 783.328240] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 783.328240] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 783.328240] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 783.328240] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 783.328240] nova-conductor[52216]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 783.328240] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 783.328240] nova-conductor[52216]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 783.328240] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 783.328240] nova-conductor[52216]: ERROR nova.conductor.manager result = self.transport._send( [ 783.328240] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 783.328240] nova-conductor[52216]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 783.328240] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 783.328240] nova-conductor[52216]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 783.328240] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 783.328240] nova-conductor[52216]: ERROR nova.conductor.manager raise result [ 783.328240] nova-conductor[52216]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 783.328240] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 783.328240] nova-conductor[52216]: ERROR nova.conductor.manager [ 783.328240] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 783.328240] nova-conductor[52216]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 783.328240] nova-conductor[52216]: ERROR nova.conductor.manager [ 783.328240] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 783.328240] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._select_destinations( [ 783.328240] nova-conductor[52216]: ERROR nova.conductor.manager [ 783.328240] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 783.328240] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._schedule( [ 783.328240] nova-conductor[52216]: ERROR nova.conductor.manager [ 783.328240] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 783.328240] nova-conductor[52216]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 783.328240] nova-conductor[52216]: ERROR nova.conductor.manager [ 783.328240] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 783.328240] nova-conductor[52216]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 783.328240] nova-conductor[52216]: ERROR nova.conductor.manager [ 783.328240] nova-conductor[52216]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 783.328240] nova-conductor[52216]: ERROR nova.conductor.manager [ 783.328240] nova-conductor[52216]: ERROR nova.conductor.manager [ 783.338272] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4c4016e2-10d7-4b60-993e-c59d702f981e tempest-AttachVolumeNegativeTest-480777919 tempest-AttachVolumeNegativeTest-480777919-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 783.338272] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4c4016e2-10d7-4b60-993e-c59d702f981e tempest-AttachVolumeNegativeTest-480777919 tempest-AttachVolumeNegativeTest-480777919-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 783.338498] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4c4016e2-10d7-4b60-993e-c59d702f981e tempest-AttachVolumeNegativeTest-480777919 tempest-AttachVolumeNegativeTest-480777919-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 783.396029] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-4c4016e2-10d7-4b60-993e-c59d702f981e tempest-AttachVolumeNegativeTest-480777919 tempest-AttachVolumeNegativeTest-480777919-project-member] [instance: 88d77439-137f-4c9d-923f-225dd5b8be2b] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 783.396029] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4c4016e2-10d7-4b60-993e-c59d702f981e tempest-AttachVolumeNegativeTest-480777919 tempest-AttachVolumeNegativeTest-480777919-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 783.396029] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4c4016e2-10d7-4b60-993e-c59d702f981e tempest-AttachVolumeNegativeTest-480777919 tempest-AttachVolumeNegativeTest-480777919-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 783.396029] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-4c4016e2-10d7-4b60-993e-c59d702f981e tempest-AttachVolumeNegativeTest-480777919 tempest-AttachVolumeNegativeTest-480777919-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 783.402420] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-4c4016e2-10d7-4b60-993e-c59d702f981e tempest-AttachVolumeNegativeTest-480777919 tempest-AttachVolumeNegativeTest-480777919-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 783.402420] nova-conductor[52216]: Traceback (most recent call last): [ 783.402420] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 783.402420] nova-conductor[52216]: return func(*args, **kwargs) [ 783.402420] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 783.402420] nova-conductor[52216]: selections = self._select_destinations( [ 783.402420] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 783.402420] nova-conductor[52216]: selections = self._schedule( [ 783.402420] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 783.402420] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 783.402420] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 783.402420] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 783.402420] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 783.402420] nova-conductor[52216]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 783.403328] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-4c4016e2-10d7-4b60-993e-c59d702f981e tempest-AttachVolumeNegativeTest-480777919 tempest-AttachVolumeNegativeTest-480777919-project-member] [instance: 88d77439-137f-4c9d-923f-225dd5b8be2b] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 783.577429] nova-conductor[52217]: ERROR nova.conductor.manager [None req-32fc7b33-c243-4c35-8be9-ac373a339138 tempest-ServerActionsTestOtherA-1633702584 tempest-ServerActionsTestOtherA-1633702584-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 783.577429] nova-conductor[52217]: Traceback (most recent call last): [ 783.577429] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 783.577429] nova-conductor[52217]: return func(*args, **kwargs) [ 783.577429] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 783.577429] nova-conductor[52217]: selections = self._select_destinations( [ 783.577429] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 783.577429] nova-conductor[52217]: selections = self._schedule( [ 783.577429] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 783.577429] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 783.577429] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 783.577429] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 783.577429] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 783.577429] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 783.577429] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 783.577429] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 783.577429] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 783.577429] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 783.577429] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 783.577429] nova-conductor[52217]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 783.577429] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 783.577429] nova-conductor[52217]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 783.577429] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 783.577429] nova-conductor[52217]: ERROR nova.conductor.manager result = self.transport._send( [ 783.577429] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 783.577429] nova-conductor[52217]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 783.577429] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 783.577429] nova-conductor[52217]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 783.577429] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 783.577429] nova-conductor[52217]: ERROR nova.conductor.manager raise result [ 783.577429] nova-conductor[52217]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 783.577429] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 783.577429] nova-conductor[52217]: ERROR nova.conductor.manager [ 783.577429] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 783.577429] nova-conductor[52217]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 783.577429] nova-conductor[52217]: ERROR nova.conductor.manager [ 783.577429] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 783.577429] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._select_destinations( [ 783.577429] nova-conductor[52217]: ERROR nova.conductor.manager [ 783.577429] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 783.577429] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._schedule( [ 783.577429] nova-conductor[52217]: ERROR nova.conductor.manager [ 783.577429] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 783.577429] nova-conductor[52217]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 783.577429] nova-conductor[52217]: ERROR nova.conductor.manager [ 783.577429] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 783.577429] nova-conductor[52217]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 783.577429] nova-conductor[52217]: ERROR nova.conductor.manager [ 783.577429] nova-conductor[52217]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
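
Annotation: the traceback above bottoms out in nova/scheduler/manager.py, where _ensure_sufficient_hosts raises NoValidHost after filtering leaves fewer candidate hosts than the request needs. The snippet below is a minimal, illustrative sketch of that final sufficiency check only; the class and function names mirror the log but are not Nova's actual code.

    # Minimal sketch of the host-sufficiency check that produces the
    # NoValidHost traceback above. Names and signatures are illustrative,
    # not Nova's real implementation.

    class NoValidHost(Exception):
        """Raised when scheduling cannot place the requested instances."""


    def ensure_sufficient_hosts(hosts, num_instances):
        """Fail the request when fewer hosts survived filtering than the
        number of instances that must be placed."""
        if len(hosts) < num_instances:
            raise NoValidHost("There are not enough hosts available.")


    if __name__ == "__main__":
        try:
            # No compute hosts passed the filters, one instance requested.
            ensure_sufficient_hosts(hosts=[], num_instances=1)
        except NoValidHost as exc:
            print(f"No valid host was found. {exc}")
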
[ 783.577429] nova-conductor[52217]: ERROR nova.conductor.manager [ 783.577429] nova-conductor[52217]: ERROR nova.conductor.manager [ 783.586381] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-32fc7b33-c243-4c35-8be9-ac373a339138 tempest-ServerActionsTestOtherA-1633702584 tempest-ServerActionsTestOtherA-1633702584-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 783.586696] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-32fc7b33-c243-4c35-8be9-ac373a339138 tempest-ServerActionsTestOtherA-1633702584 tempest-ServerActionsTestOtherA-1633702584-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 783.586882] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-32fc7b33-c243-4c35-8be9-ac373a339138 tempest-ServerActionsTestOtherA-1633702584 tempest-ServerActionsTestOtherA-1633702584-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 783.648026] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-32fc7b33-c243-4c35-8be9-ac373a339138 tempest-ServerActionsTestOtherA-1633702584 tempest-ServerActionsTestOtherA-1633702584-project-member] [instance: 8ebfa4fe-4081-4857-bcc8-4e3792822126] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 783.648744] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-32fc7b33-c243-4c35-8be9-ac373a339138 tempest-ServerActionsTestOtherA-1633702584 tempest-ServerActionsTestOtherA-1633702584-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 783.648955] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-32fc7b33-c243-4c35-8be9-ac373a339138 tempest-ServerActionsTestOtherA-1633702584 tempest-ServerActionsTestOtherA-1633702584-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 783.649146] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-32fc7b33-c243-4c35-8be9-ac373a339138 tempest-ServerActionsTestOtherA-1633702584 tempest-ServerActionsTestOtherA-1633702584-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 783.652892] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-32fc7b33-c243-4c35-8be9-ac373a339138 tempest-ServerActionsTestOtherA-1633702584 tempest-ServerActionsTestOtherA-1633702584-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 783.652892] nova-conductor[52217]: Traceback (most recent call last): [ 783.652892] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 783.652892] nova-conductor[52217]: return func(*args, **kwargs) [ 783.652892] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 783.652892] nova-conductor[52217]: selections = self._select_destinations( [ 783.652892] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 783.652892] nova-conductor[52217]: selections = self._schedule( [ 783.652892] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 783.652892] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 783.652892] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 783.652892] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 783.652892] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 783.652892] nova-conductor[52217]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 783.653373] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-32fc7b33-c243-4c35-8be9-ac373a339138 tempest-ServerActionsTestOtherA-1633702584 tempest-ServerActionsTestOtherA-1633702584-project-member] [instance: 8ebfa4fe-4081-4857-bcc8-4e3792822126] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 784.269085] nova-conductor[52216]: ERROR nova.conductor.manager [None req-747e4e57-21a9-4634-b830-7b2fdcd34aa7 tempest-ServerShowV257Test-2006375458 tempest-ServerShowV257Test-2006375458-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 784.269085] nova-conductor[52216]: Traceback (most recent call last): [ 784.269085] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 784.269085] nova-conductor[52216]: return func(*args, **kwargs) [ 784.269085] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 784.269085] nova-conductor[52216]: selections = self._select_destinations( [ 784.269085] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 784.269085] nova-conductor[52216]: selections = self._schedule( [ 784.269085] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 784.269085] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 784.269085] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 784.269085] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 784.269085] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 784.269085] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 784.269085] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 784.269085] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 784.269085] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 784.269085] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 784.269085] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 784.269085] nova-conductor[52216]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 784.269085] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 784.269085] nova-conductor[52216]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 784.269085] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 784.269085] nova-conductor[52216]: ERROR nova.conductor.manager result = self.transport._send( [ 784.269085] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 784.269085] nova-conductor[52216]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 784.269085] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 784.269085] nova-conductor[52216]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 784.269085] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 784.269085] nova-conductor[52216]: ERROR nova.conductor.manager raise result [ 784.269085] nova-conductor[52216]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 784.269085] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 784.269085] nova-conductor[52216]: ERROR nova.conductor.manager [ 784.269085] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 784.269085] nova-conductor[52216]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 784.269085] nova-conductor[52216]: ERROR nova.conductor.manager [ 784.269085] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 784.269085] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._select_destinations( [ 784.269085] nova-conductor[52216]: ERROR nova.conductor.manager [ 784.269085] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 784.269085] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._schedule( [ 784.269085] nova-conductor[52216]: ERROR nova.conductor.manager [ 784.269085] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 784.269085] nova-conductor[52216]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 784.269085] nova-conductor[52216]: ERROR nova.conductor.manager [ 784.269085] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 784.269085] nova-conductor[52216]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 784.269085] nova-conductor[52216]: ERROR nova.conductor.manager [ 784.269085] nova-conductor[52216]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
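
Annotation: the same failure shows up twice in each entry, as nova.exception.NoValidHost inside the scheduler-side traceback and as nova.exception_Remote.NoValidHost_Remote on the conductor side. That reflects how the RPC layer serializes a server-side exception and re-raises it in the caller as a dynamically built "_Remote" class in a parallel "_Remote" module namespace. The sketch below only illustrates that idea under assumptions; it is not oslo.messaging's actual implementation.

    # Rough sketch of re-raising a server-side exception in the calling
    # process as a dynamically created "<Name>_Remote" class, which is why
    # the conductor logs nova.exception_Remote.NoValidHost_Remote.
    # Illustrative only.

    class NoValidHost(Exception):
        pass

    NoValidHost.__module__ = "nova.exception"  # pretend it lives in Nova


    def serialize_remote_error(exc):
        """What the RPC server side would send back instead of a reply."""
        return {
            "class": type(exc).__name__,
            "module": type(exc).__module__,
            "message": str(exc),
        }


    def deserialize_remote_error(failure):
        """Rebuild the error on the client side as a *_Remote subclass."""
        remote_cls = type(failure["class"] + "_Remote", (Exception,), {})
        remote_cls.__module__ = failure["module"] + "_Remote"
        return remote_cls(failure["message"])


    if __name__ == "__main__":
        failure = serialize_remote_error(
            NoValidHost("No valid host was found. "
                        "There are not enough hosts available."))
        raised = deserialize_remote_error(failure)
        print(type(raised).__module__, type(raised).__name__, "-", raised)
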
[ 784.269085] nova-conductor[52216]: ERROR nova.conductor.manager [ 784.269085] nova-conductor[52216]: ERROR nova.conductor.manager [ 784.279756] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-747e4e57-21a9-4634-b830-7b2fdcd34aa7 tempest-ServerShowV257Test-2006375458 tempest-ServerShowV257Test-2006375458-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 784.279756] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-747e4e57-21a9-4634-b830-7b2fdcd34aa7 tempest-ServerShowV257Test-2006375458 tempest-ServerShowV257Test-2006375458-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 784.279756] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-747e4e57-21a9-4634-b830-7b2fdcd34aa7 tempest-ServerShowV257Test-2006375458 tempest-ServerShowV257Test-2006375458-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 784.333920] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-747e4e57-21a9-4634-b830-7b2fdcd34aa7 tempest-ServerShowV257Test-2006375458 tempest-ServerShowV257Test-2006375458-project-member] [instance: 98760198-27e1-4ef9-9c02-13a56a06abed] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 784.334814] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-747e4e57-21a9-4634-b830-7b2fdcd34aa7 tempest-ServerShowV257Test-2006375458 tempest-ServerShowV257Test-2006375458-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 784.335100] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-747e4e57-21a9-4634-b830-7b2fdcd34aa7 tempest-ServerShowV257Test-2006375458 tempest-ServerShowV257Test-2006375458-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 784.336145] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-747e4e57-21a9-4634-b830-7b2fdcd34aa7 tempest-ServerShowV257Test-2006375458 tempest-ServerShowV257Test-2006375458-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 784.340401] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-747e4e57-21a9-4634-b830-7b2fdcd34aa7 tempest-ServerShowV257Test-2006375458 tempest-ServerShowV257Test-2006375458-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 784.340401] nova-conductor[52216]: Traceback (most recent call last): [ 784.340401] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 784.340401] nova-conductor[52216]: return func(*args, **kwargs) [ 784.340401] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 784.340401] nova-conductor[52216]: selections = self._select_destinations( [ 784.340401] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 784.340401] nova-conductor[52216]: selections = self._schedule( [ 784.340401] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 784.340401] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 784.340401] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 784.340401] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 784.340401] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 784.340401] nova-conductor[52216]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 784.341332] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-747e4e57-21a9-4634-b830-7b2fdcd34aa7 tempest-ServerShowV257Test-2006375458 tempest-ServerShowV257Test-2006375458-project-member] [instance: 98760198-27e1-4ef9-9c02-13a56a06abed] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 784.551018] nova-conductor[52217]: ERROR nova.conductor.manager [None req-523a6ca6-3f39-468d-a314-1015529a095f tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 784.551018] nova-conductor[52217]: Traceback (most recent call last): [ 784.551018] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 784.551018] nova-conductor[52217]: return func(*args, **kwargs) [ 784.551018] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 784.551018] nova-conductor[52217]: selections = self._select_destinations( [ 784.551018] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 784.551018] nova-conductor[52217]: selections = self._schedule( [ 784.551018] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 784.551018] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 784.551018] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 784.551018] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 784.551018] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 784.551018] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 784.551018] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 784.551018] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 784.551018] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 784.551018] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 784.551018] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 784.551018] nova-conductor[52217]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 784.551018] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 784.551018] nova-conductor[52217]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 784.551018] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 784.551018] nova-conductor[52217]: ERROR nova.conductor.manager result = self.transport._send( [ 784.551018] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 784.551018] nova-conductor[52217]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 784.551018] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 784.551018] nova-conductor[52217]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 784.551018] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 784.551018] nova-conductor[52217]: ERROR nova.conductor.manager raise result [ 784.551018] nova-conductor[52217]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 784.551018] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 784.551018] nova-conductor[52217]: ERROR nova.conductor.manager [ 784.551018] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 784.551018] nova-conductor[52217]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 784.551018] nova-conductor[52217]: ERROR nova.conductor.manager [ 784.551018] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 784.551018] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._select_destinations( [ 784.551018] nova-conductor[52217]: ERROR nova.conductor.manager [ 784.551018] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 784.551018] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._schedule( [ 784.551018] nova-conductor[52217]: ERROR nova.conductor.manager [ 784.551018] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 784.551018] nova-conductor[52217]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 784.551018] nova-conductor[52217]: ERROR nova.conductor.manager [ 784.551018] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 784.551018] nova-conductor[52217]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 784.551018] nova-conductor[52217]: ERROR nova.conductor.manager [ 784.551018] nova-conductor[52217]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
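
Annotation: the oslo_concurrency.lockutils DEBUG lines around each attempt come from nova.context targeting cell0 (UUID 00000000-...): the per-cell database connection is cached, and a lock named after the cell UUID guards the "create it if missing" step so concurrent requests do not build duplicate connections. Below is a small sketch of that cache-under-named-lock pattern, assuming oslo.concurrency is installed; the function and cache names are hypothetical.

    # Sketch of the pattern behind the repeated "Acquiring lock / acquired /
    # released" DEBUG lines above: the first caller to target a cell builds
    # its DB connection, later callers reuse the cached one. Names are
    # illustrative, not Nova's actual code.
    from oslo_concurrency import lockutils

    CELL_CACHE = {}


    def get_or_create_cell_connection(cell_uuid, connect):
        """Return the cached connection for cell_uuid, creating it at most once."""
        # Fast path: no lock needed once the connection is cached.
        if cell_uuid in CELL_CACHE:
            return CELL_CACHE[cell_uuid]

        # Slow path: serialize on a lock named after the cell UUID.
        with lockutils.lock(cell_uuid):
            if cell_uuid not in CELL_CACHE:
                CELL_CACHE[cell_uuid] = connect(cell_uuid)
            return CELL_CACHE[cell_uuid]


    if __name__ == "__main__":
        conn = get_or_create_cell_connection(
            "00000000-0000-0000-0000-000000000000",
            connect=lambda uuid: f"db-connection-for-{uuid}")
        print(conn)
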
[ 784.551018] nova-conductor[52217]: ERROR nova.conductor.manager [ 784.551018] nova-conductor[52217]: ERROR nova.conductor.manager [ 784.564874] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-523a6ca6-3f39-468d-a314-1015529a095f tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 784.564939] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-523a6ca6-3f39-468d-a314-1015529a095f tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 784.565077] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-523a6ca6-3f39-468d-a314-1015529a095f tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 784.634559] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-523a6ca6-3f39-468d-a314-1015529a095f tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] [instance: 17c02358-c0d6-472e-90fb-38f5f3172214] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 784.636664] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-523a6ca6-3f39-468d-a314-1015529a095f tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 784.636836] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-523a6ca6-3f39-468d-a314-1015529a095f tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 784.638016] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-523a6ca6-3f39-468d-a314-1015529a095f tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 784.645287] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-523a6ca6-3f39-468d-a314-1015529a095f tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 784.645287] nova-conductor[52217]: Traceback (most recent call last): [ 784.645287] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 784.645287] nova-conductor[52217]: return func(*args, **kwargs) [ 784.645287] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 784.645287] nova-conductor[52217]: selections = self._select_destinations( [ 784.645287] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 784.645287] nova-conductor[52217]: selections = self._schedule( [ 784.645287] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 784.645287] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 784.645287] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 784.645287] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 784.645287] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 784.645287] nova-conductor[52217]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 784.645287] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-523a6ca6-3f39-468d-a314-1015529a095f tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] [instance: 17c02358-c0d6-472e-90fb-38f5f3172214] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 786.329861] nova-conductor[52216]: ERROR nova.conductor.manager [None req-43961c24-bab1-4e16-9738-e485ca43394e tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 786.329861] nova-conductor[52216]: Traceback (most recent call last): [ 786.329861] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 786.329861] nova-conductor[52216]: return func(*args, **kwargs) [ 786.329861] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 786.329861] nova-conductor[52216]: selections = self._select_destinations( [ 786.329861] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 786.329861] nova-conductor[52216]: selections = self._schedule( [ 786.329861] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 786.329861] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 786.329861] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 786.329861] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 786.329861] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 786.329861] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 786.329861] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 786.329861] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 786.329861] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 786.329861] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 786.329861] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 786.329861] nova-conductor[52216]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 786.329861] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 786.329861] nova-conductor[52216]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 786.329861] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 786.329861] nova-conductor[52216]: ERROR nova.conductor.manager result = self.transport._send( [ 786.329861] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 786.329861] nova-conductor[52216]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 786.329861] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 786.329861] nova-conductor[52216]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 786.329861] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 786.329861] nova-conductor[52216]: ERROR nova.conductor.manager raise result [ 786.329861] nova-conductor[52216]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 786.329861] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 786.329861] nova-conductor[52216]: ERROR nova.conductor.manager [ 786.329861] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 786.329861] nova-conductor[52216]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 786.329861] nova-conductor[52216]: ERROR nova.conductor.manager [ 786.329861] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 786.329861] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._select_destinations( [ 786.329861] nova-conductor[52216]: ERROR nova.conductor.manager [ 786.329861] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 786.329861] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._schedule( [ 786.329861] nova-conductor[52216]: ERROR nova.conductor.manager [ 786.329861] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 786.329861] nova-conductor[52216]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 786.329861] nova-conductor[52216]: ERROR nova.conductor.manager [ 786.329861] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 786.329861] nova-conductor[52216]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 786.329861] nova-conductor[52216]: ERROR nova.conductor.manager [ 786.329861] nova-conductor[52216]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
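
Annotation: each build request also logs a BlockDeviceMapping with source_type='image', destination_type='local', boot_index=0 and delete_on_termination=True, i.e. a plain boot-from-image root disk on local hypervisor storage. Expressed as the request-side block_device_mapping_v2 entry it corresponds to (only the keys shown are taken from the log; the surrounding dict is a sketch, not pulled from these requests):

    # The BDM logged above corresponds to a boot-from-image root disk.
    import json

    image_id = "2efa4364-ba59-4de9-978f-169a769ee710"  # image UUID from the log

    root_disk_bdm = {
        "boot_index": 0,                 # this is the boot device
        "uuid": image_id,                # source image
        "source_type": "image",          # boot from an image ...
        "destination_type": "local",     # ... onto local hypervisor storage
        "delete_on_termination": True,   # discard the disk with the server
    }

    print(json.dumps({"block_device_mapping_v2": [root_disk_bdm]}, indent=2))
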
[ 786.329861] nova-conductor[52216]: ERROR nova.conductor.manager [ 786.329861] nova-conductor[52216]: ERROR nova.conductor.manager [ 786.339507] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-43961c24-bab1-4e16-9738-e485ca43394e tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 786.340224] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-43961c24-bab1-4e16-9738-e485ca43394e tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 786.341091] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-43961c24-bab1-4e16-9738-e485ca43394e tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 786.394346] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-43961c24-bab1-4e16-9738-e485ca43394e tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] [instance: 3399c738-e838-46d6-b466-56b5707c3459] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 786.394346] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-43961c24-bab1-4e16-9738-e485ca43394e tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 786.394346] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-43961c24-bab1-4e16-9738-e485ca43394e tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 786.394346] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-43961c24-bab1-4e16-9738-e485ca43394e tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 786.400886] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-43961c24-bab1-4e16-9738-e485ca43394e tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 786.400886] nova-conductor[52216]: Traceback (most recent call last): [ 786.400886] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 786.400886] nova-conductor[52216]: return func(*args, **kwargs) [ 786.400886] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 786.400886] nova-conductor[52216]: selections = self._select_destinations( [ 786.400886] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 786.400886] nova-conductor[52216]: selections = self._schedule( [ 786.400886] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 786.400886] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 786.400886] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 786.400886] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 786.400886] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 786.400886] nova-conductor[52216]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 786.401487] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-43961c24-bab1-4e16-9738-e485ca43394e tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] [instance: 3399c738-e838-46d6-b466-56b5707c3459] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 787.200020] nova-conductor[52217]: ERROR nova.conductor.manager [None req-7088bded-4d14-436b-8b47-9244afa1838d tempest-AttachVolumeTestJSON-81924137 tempest-AttachVolumeTestJSON-81924137-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 787.200020] nova-conductor[52217]: Traceback (most recent call last): [ 787.200020] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 787.200020] nova-conductor[52217]: return func(*args, **kwargs) [ 787.200020] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 787.200020] nova-conductor[52217]: selections = self._select_destinations( [ 787.200020] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 787.200020] nova-conductor[52217]: selections = self._schedule( [ 787.200020] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 787.200020] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 787.200020] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 787.200020] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 787.200020] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 787.200020] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 787.200020] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 787.200020] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 787.200020] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 787.200020] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 787.200020] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 787.200020] nova-conductor[52217]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 787.200020] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 787.200020] nova-conductor[52217]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 787.200020] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 787.200020] nova-conductor[52217]: ERROR nova.conductor.manager result = self.transport._send( [ 787.200020] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 787.200020] nova-conductor[52217]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 787.200020] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 787.200020] nova-conductor[52217]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 787.200020] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 787.200020] nova-conductor[52217]: ERROR nova.conductor.manager raise result [ 787.200020] nova-conductor[52217]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 787.200020] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 787.200020] nova-conductor[52217]: ERROR nova.conductor.manager [ 787.200020] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 787.200020] nova-conductor[52217]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 787.200020] nova-conductor[52217]: ERROR nova.conductor.manager [ 787.200020] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 787.200020] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._select_destinations( [ 787.200020] nova-conductor[52217]: ERROR nova.conductor.manager [ 787.200020] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 787.200020] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._schedule( [ 787.200020] nova-conductor[52217]: ERROR nova.conductor.manager [ 787.200020] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 787.200020] nova-conductor[52217]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 787.200020] nova-conductor[52217]: ERROR nova.conductor.manager [ 787.200020] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 787.200020] nova-conductor[52217]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 787.200020] nova-conductor[52217]: ERROR nova.conductor.manager [ 787.200020] nova-conductor[52217]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
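
Annotation: on the conductor side the traceback runs schedule_and_build_instances, then _schedule_instances, then the scheduler RPC select_destinations; when that call re-raises NoValidHost the build is not retried, and nova.scheduler.utils logs the warning and flips the instance to ERROR (the "Setting instance to ERROR state" lines). The sketch below condenses that control flow with invented helper names; it is not Nova's code.

    # Condensed sketch of the conductor-side flow visible above: ask the
    # scheduler for destinations, and on NoValidHost give up and mark the
    # instance as ERROR instead of retrying. All names are illustrative.

    class NoValidHost(Exception):
        pass


    def select_destinations(request_spec):
        """Stand-in for the scheduler RPC call; here it always fails."""
        raise NoValidHost("There are not enough hosts available.")


    def schedule_and_build_instance(instance, request_spec, set_instance_error):
        try:
            return select_destinations(request_spec)
        except NoValidHost as exc:
            # A scheduling failure is terminal for this build request:
            # log it, put the instance into ERROR, and stop.
            print(f"Failed to schedule instances: {exc}")
            set_instance_error(instance, reason=str(exc))
            return None


    if __name__ == "__main__":
        schedule_and_build_instance(
            instance="88d77439-137f-4c9d-923f-225dd5b8be2b",
            request_spec={},
            set_instance_error=lambda inst, reason: print(
                f"[instance: {inst}] Setting instance to ERROR state.: {reason}"))
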
[ 787.200020] nova-conductor[52217]: ERROR nova.conductor.manager [ 787.200020] nova-conductor[52217]: ERROR nova.conductor.manager [ 787.208814] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-7088bded-4d14-436b-8b47-9244afa1838d tempest-AttachVolumeTestJSON-81924137 tempest-AttachVolumeTestJSON-81924137-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 787.208814] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-7088bded-4d14-436b-8b47-9244afa1838d tempest-AttachVolumeTestJSON-81924137 tempest-AttachVolumeTestJSON-81924137-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 787.208814] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-7088bded-4d14-436b-8b47-9244afa1838d tempest-AttachVolumeTestJSON-81924137 tempest-AttachVolumeTestJSON-81924137-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 787.245603] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-7088bded-4d14-436b-8b47-9244afa1838d tempest-AttachVolumeTestJSON-81924137 tempest-AttachVolumeTestJSON-81924137-project-member] [instance: ac47a325-e2d9-466f-ac34-76eb85f72560] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 787.248099] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-7088bded-4d14-436b-8b47-9244afa1838d tempest-AttachVolumeTestJSON-81924137 tempest-AttachVolumeTestJSON-81924137-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 787.248099] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-7088bded-4d14-436b-8b47-9244afa1838d tempest-AttachVolumeTestJSON-81924137 tempest-AttachVolumeTestJSON-81924137-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 787.248099] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-7088bded-4d14-436b-8b47-9244afa1838d tempest-AttachVolumeTestJSON-81924137 tempest-AttachVolumeTestJSON-81924137-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 787.250877] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-7088bded-4d14-436b-8b47-9244afa1838d tempest-AttachVolumeTestJSON-81924137 tempest-AttachVolumeTestJSON-81924137-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 787.250877] nova-conductor[52217]: Traceback (most recent call last): [ 787.250877] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 787.250877] nova-conductor[52217]: return func(*args, **kwargs) [ 787.250877] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 787.250877] nova-conductor[52217]: selections = self._select_destinations( [ 787.250877] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 787.250877] nova-conductor[52217]: selections = self._schedule( [ 787.250877] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 787.250877] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 787.250877] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 787.250877] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 787.250877] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 787.250877] nova-conductor[52217]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 787.253510] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-7088bded-4d14-436b-8b47-9244afa1838d tempest-AttachVolumeTestJSON-81924137 tempest-AttachVolumeTestJSON-81924137-project-member] [instance: ac47a325-e2d9-466f-ac34-76eb85f72560] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 788.358256] nova-conductor[52216]: ERROR nova.conductor.manager [None req-7d2d6358-1ac3-42ab-bdf7-f0963deb71cc tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 788.358256] nova-conductor[52216]: Traceback (most recent call last): [ 788.358256] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 788.358256] nova-conductor[52216]: return func(*args, **kwargs) [ 788.358256] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 788.358256] nova-conductor[52216]: selections = self._select_destinations( [ 788.358256] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 788.358256] nova-conductor[52216]: selections = self._schedule( [ 788.358256] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 788.358256] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 788.358256] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 788.358256] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 788.358256] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 788.358256] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 788.358256] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 788.358256] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 788.358256] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 788.358256] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 788.358256] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 788.358256] nova-conductor[52216]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 788.358256] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 788.358256] nova-conductor[52216]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 788.358256] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 788.358256] nova-conductor[52216]: ERROR nova.conductor.manager result = self.transport._send( [ 788.358256] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 788.358256] nova-conductor[52216]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 788.358256] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 788.358256] nova-conductor[52216]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 788.358256] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 788.358256] nova-conductor[52216]: ERROR nova.conductor.manager raise result [ 788.358256] nova-conductor[52216]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 788.358256] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 788.358256] nova-conductor[52216]: ERROR nova.conductor.manager [ 788.358256] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 788.358256] nova-conductor[52216]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 788.358256] nova-conductor[52216]: ERROR nova.conductor.manager [ 788.358256] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 788.358256] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._select_destinations( [ 788.358256] nova-conductor[52216]: ERROR nova.conductor.manager [ 788.358256] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 788.358256] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._schedule( [ 788.358256] nova-conductor[52216]: ERROR nova.conductor.manager [ 788.358256] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 788.358256] nova-conductor[52216]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 788.358256] nova-conductor[52216]: ERROR nova.conductor.manager [ 788.358256] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 788.358256] nova-conductor[52216]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 788.358256] nova-conductor[52216]: ERROR nova.conductor.manager [ 788.358256] nova-conductor[52216]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 788.358256] nova-conductor[52216]: ERROR nova.conductor.manager [ 788.358256] nova-conductor[52216]: ERROR nova.conductor.manager [ 788.364528] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-7d2d6358-1ac3-42ab-bdf7-f0963deb71cc tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 788.364745] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-7d2d6358-1ac3-42ab-bdf7-f0963deb71cc tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 788.364927] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-7d2d6358-1ac3-42ab-bdf7-f0963deb71cc tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 788.428051] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-7d2d6358-1ac3-42ab-bdf7-f0963deb71cc tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] [instance: 94fc46a4-2370-4383-b9b9-ea6cea8ff1a8] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 788.428051] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-7d2d6358-1ac3-42ab-bdf7-f0963deb71cc tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 788.428051] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-7d2d6358-1ac3-42ab-bdf7-f0963deb71cc tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 788.428051] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-7d2d6358-1ac3-42ab-bdf7-f0963deb71cc tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 788.431772] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-7d2d6358-1ac3-42ab-bdf7-f0963deb71cc tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 788.431772] nova-conductor[52216]: Traceback (most recent call last): [ 788.431772] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 788.431772] nova-conductor[52216]: return func(*args, **kwargs) [ 788.431772] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 788.431772] nova-conductor[52216]: selections = self._select_destinations( [ 788.431772] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 788.431772] nova-conductor[52216]: selections = self._schedule( [ 788.431772] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 788.431772] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 788.431772] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 788.431772] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 788.431772] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 788.431772] nova-conductor[52216]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 788.432350] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-7d2d6358-1ac3-42ab-bdf7-f0963deb71cc tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] [instance: 94fc46a4-2370-4383-b9b9-ea6cea8ff1a8] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 788.468821] nova-conductor[52217]: ERROR nova.conductor.manager [None req-c1652bc2-fb09-4bf9-a9b8-7af94209453e tempest-AttachVolumeNegativeTest-480777919 tempest-AttachVolumeNegativeTest-480777919-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 788.468821] nova-conductor[52217]: Traceback (most recent call last): [ 788.468821] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 788.468821] nova-conductor[52217]: return func(*args, **kwargs) [ 788.468821] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 788.468821] nova-conductor[52217]: selections = self._select_destinations( [ 788.468821] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 788.468821] nova-conductor[52217]: selections = self._schedule( [ 788.468821] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 788.468821] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 788.468821] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 788.468821] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 788.468821] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 788.468821] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 788.468821] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 788.468821] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 788.468821] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 788.468821] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 788.468821] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 788.468821] nova-conductor[52217]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 788.468821] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 788.468821] nova-conductor[52217]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 788.468821] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 788.468821] nova-conductor[52217]: ERROR nova.conductor.manager result = self.transport._send( [ 788.468821] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 788.468821] nova-conductor[52217]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 788.468821] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 788.468821] nova-conductor[52217]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 788.468821] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 788.468821] nova-conductor[52217]: ERROR nova.conductor.manager raise result [ 788.468821] nova-conductor[52217]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 788.468821] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 788.468821] nova-conductor[52217]: ERROR nova.conductor.manager [ 788.468821] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 788.468821] nova-conductor[52217]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 788.468821] nova-conductor[52217]: ERROR nova.conductor.manager [ 788.468821] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 788.468821] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._select_destinations( [ 788.468821] nova-conductor[52217]: ERROR nova.conductor.manager [ 788.468821] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 788.468821] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._schedule( [ 788.468821] nova-conductor[52217]: ERROR nova.conductor.manager [ 788.468821] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 788.468821] nova-conductor[52217]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 788.468821] nova-conductor[52217]: ERROR nova.conductor.manager [ 788.468821] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 788.468821] nova-conductor[52217]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 788.468821] nova-conductor[52217]: ERROR nova.conductor.manager [ 788.468821] nova-conductor[52217]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 788.468821] nova-conductor[52217]: ERROR nova.conductor.manager [ 788.468821] nova-conductor[52217]: ERROR nova.conductor.manager [ 788.475297] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c1652bc2-fb09-4bf9-a9b8-7af94209453e tempest-AttachVolumeNegativeTest-480777919 tempest-AttachVolumeNegativeTest-480777919-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 788.475575] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c1652bc2-fb09-4bf9-a9b8-7af94209453e tempest-AttachVolumeNegativeTest-480777919 tempest-AttachVolumeNegativeTest-480777919-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 788.475785] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c1652bc2-fb09-4bf9-a9b8-7af94209453e tempest-AttachVolumeNegativeTest-480777919 tempest-AttachVolumeNegativeTest-480777919-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 788.511261] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-c1652bc2-fb09-4bf9-a9b8-7af94209453e tempest-AttachVolumeNegativeTest-480777919 tempest-AttachVolumeNegativeTest-480777919-project-member] [instance: 52431646-0d8a-4321-ab53-1dd6de4a31fd] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 788.512099] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c1652bc2-fb09-4bf9-a9b8-7af94209453e tempest-AttachVolumeNegativeTest-480777919 tempest-AttachVolumeNegativeTest-480777919-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 788.512381] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c1652bc2-fb09-4bf9-a9b8-7af94209453e tempest-AttachVolumeNegativeTest-480777919 tempest-AttachVolumeNegativeTest-480777919-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 788.512608] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-c1652bc2-fb09-4bf9-a9b8-7af94209453e tempest-AttachVolumeNegativeTest-480777919 tempest-AttachVolumeNegativeTest-480777919-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 788.516269] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-c1652bc2-fb09-4bf9-a9b8-7af94209453e tempest-AttachVolumeNegativeTest-480777919 tempest-AttachVolumeNegativeTest-480777919-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 788.516269] nova-conductor[52217]: Traceback (most recent call last): [ 788.516269] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 788.516269] nova-conductor[52217]: return func(*args, **kwargs) [ 788.516269] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 788.516269] nova-conductor[52217]: selections = self._select_destinations( [ 788.516269] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 788.516269] nova-conductor[52217]: selections = self._schedule( [ 788.516269] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 788.516269] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 788.516269] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 788.516269] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 788.516269] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 788.516269] nova-conductor[52217]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 788.517035] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-c1652bc2-fb09-4bf9-a9b8-7af94209453e tempest-AttachVolumeNegativeTest-480777919 tempest-AttachVolumeNegativeTest-480777919-project-member] [instance: 52431646-0d8a-4321-ab53-1dd6de4a31fd] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 790.727457] nova-conductor[52216]: ERROR nova.conductor.manager [None req-81757ea2-0d33-4688-a670-2108e037ff34 tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 790.727457] nova-conductor[52216]: Traceback (most recent call last): [ 790.727457] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 790.727457] nova-conductor[52216]: return func(*args, **kwargs) [ 790.727457] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 790.727457] nova-conductor[52216]: selections = self._select_destinations( [ 790.727457] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 790.727457] nova-conductor[52216]: selections = self._schedule( [ 790.727457] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 790.727457] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 790.727457] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 790.727457] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 790.727457] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 790.727457] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 790.727457] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 790.727457] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 790.727457] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 790.727457] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 790.727457] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 790.727457] nova-conductor[52216]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 790.727457] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 790.727457] nova-conductor[52216]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 790.727457] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 790.727457] nova-conductor[52216]: ERROR nova.conductor.manager result = self.transport._send( [ 790.727457] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 790.727457] nova-conductor[52216]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 790.727457] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 790.727457] nova-conductor[52216]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 790.727457] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 790.727457] nova-conductor[52216]: ERROR nova.conductor.manager raise result [ 790.727457] nova-conductor[52216]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 790.727457] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 790.727457] nova-conductor[52216]: ERROR nova.conductor.manager [ 790.727457] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 790.727457] nova-conductor[52216]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 790.727457] nova-conductor[52216]: ERROR nova.conductor.manager [ 790.727457] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 790.727457] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._select_destinations( [ 790.727457] nova-conductor[52216]: ERROR nova.conductor.manager [ 790.727457] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 790.727457] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._schedule( [ 790.727457] nova-conductor[52216]: ERROR nova.conductor.manager [ 790.727457] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 790.727457] nova-conductor[52216]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 790.727457] nova-conductor[52216]: ERROR nova.conductor.manager [ 790.727457] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 790.727457] nova-conductor[52216]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 790.727457] nova-conductor[52216]: ERROR nova.conductor.manager [ 790.727457] nova-conductor[52216]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 790.727457] nova-conductor[52216]: ERROR nova.conductor.manager [ 790.727457] nova-conductor[52216]: ERROR nova.conductor.manager [ 790.754211] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-81757ea2-0d33-4688-a670-2108e037ff34 tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 790.754704] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-81757ea2-0d33-4688-a670-2108e037ff34 tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 790.754955] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-81757ea2-0d33-4688-a670-2108e037ff34 tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 790.820378] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-81757ea2-0d33-4688-a670-2108e037ff34 tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] [instance: 38379301-3897-4ab3-bf75-5f6b3b1ad437] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 790.821141] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-81757ea2-0d33-4688-a670-2108e037ff34 tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 790.821437] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-81757ea2-0d33-4688-a670-2108e037ff34 tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 790.821626] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-81757ea2-0d33-4688-a670-2108e037ff34 tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 790.828525] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-81757ea2-0d33-4688-a670-2108e037ff34 tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 790.828525] nova-conductor[52216]: Traceback (most recent call last): [ 790.828525] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 790.828525] nova-conductor[52216]: return func(*args, **kwargs) [ 790.828525] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 790.828525] nova-conductor[52216]: selections = self._select_destinations( [ 790.828525] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 790.828525] nova-conductor[52216]: selections = self._schedule( [ 790.828525] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 790.828525] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 790.828525] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 790.828525] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 790.828525] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 790.828525] nova-conductor[52216]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 790.829054] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-81757ea2-0d33-4688-a670-2108e037ff34 tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] [instance: 38379301-3897-4ab3-bf75-5f6b3b1ad437] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 790.849753] nova-conductor[52216]: ERROR nova.scheduler.utils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance 50bbfdd5-bac5-4634-bc5d-c215a31889e9 was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 790.850302] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Rescheduling: True {{(pid=52216) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 790.850711] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 50bbfdd5-bac5-4634-bc5d-c215a31889e9.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. 
Exhausted all hosts available for retrying build failures for instance 50bbfdd5-bac5-4634-bc5d-c215a31889e9. [ 790.850941] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 50bbfdd5-bac5-4634-bc5d-c215a31889e9. [ 791.623202] nova-conductor[52216]: ERROR nova.conductor.manager [None req-615dee3b-f265-40f5-8da3-0ae75eceb329 tempest-AttachVolumeTestJSON-81924137 tempest-AttachVolumeTestJSON-81924137-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 791.623202] nova-conductor[52216]: Traceback (most recent call last): [ 791.623202] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 791.623202] nova-conductor[52216]: return func(*args, **kwargs) [ 791.623202] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 791.623202] nova-conductor[52216]: selections = self._select_destinations( [ 791.623202] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 791.623202] nova-conductor[52216]: selections = self._schedule( [ 791.623202] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 791.623202] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 791.623202] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 791.623202] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 791.623202] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 791.623202] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 791.623202] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 791.623202] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 791.623202] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 791.623202] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 791.623202] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 791.623202] nova-conductor[52216]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 791.623202] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 791.623202] nova-conductor[52216]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 791.623202] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 791.623202] nova-conductor[52216]: ERROR nova.conductor.manager result = self.transport._send( [ 791.623202] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 791.623202] nova-conductor[52216]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 791.623202] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 791.623202] nova-conductor[52216]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 791.623202] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 791.623202] nova-conductor[52216]: ERROR nova.conductor.manager raise result [ 791.623202] nova-conductor[52216]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 791.623202] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last): [ 791.623202] nova-conductor[52216]: ERROR nova.conductor.manager [ 791.623202] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 791.623202] nova-conductor[52216]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 791.623202] nova-conductor[52216]: ERROR nova.conductor.manager [ 791.623202] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 791.623202] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._select_destinations( [ 791.623202] nova-conductor[52216]: ERROR nova.conductor.manager [ 791.623202] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 791.623202] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._schedule( [ 791.623202] nova-conductor[52216]: ERROR nova.conductor.manager [ 791.623202] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 791.623202] nova-conductor[52216]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 791.623202] nova-conductor[52216]: ERROR nova.conductor.manager [ 791.623202] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 791.623202] nova-conductor[52216]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 791.623202] nova-conductor[52216]: ERROR nova.conductor.manager [ 791.623202] nova-conductor[52216]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 791.623202] nova-conductor[52216]: ERROR nova.conductor.manager [ 791.623202] nova-conductor[52216]: ERROR nova.conductor.manager [ 791.629700] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-615dee3b-f265-40f5-8da3-0ae75eceb329 tempest-AttachVolumeTestJSON-81924137 tempest-AttachVolumeTestJSON-81924137-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 791.629932] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-615dee3b-f265-40f5-8da3-0ae75eceb329 tempest-AttachVolumeTestJSON-81924137 tempest-AttachVolumeTestJSON-81924137-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 791.630132] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-615dee3b-f265-40f5-8da3-0ae75eceb329 tempest-AttachVolumeTestJSON-81924137 tempest-AttachVolumeTestJSON-81924137-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 791.671856] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-615dee3b-f265-40f5-8da3-0ae75eceb329 tempest-AttachVolumeTestJSON-81924137 tempest-AttachVolumeTestJSON-81924137-project-member] [instance: 0db9324f-9f2e-4ce7-85ba-678f3cd7cb1c] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 791.672583] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-615dee3b-f265-40f5-8da3-0ae75eceb329 tempest-AttachVolumeTestJSON-81924137 tempest-AttachVolumeTestJSON-81924137-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 791.672799] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-615dee3b-f265-40f5-8da3-0ae75eceb329 tempest-AttachVolumeTestJSON-81924137 tempest-AttachVolumeTestJSON-81924137-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 791.672963] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-615dee3b-f265-40f5-8da3-0ae75eceb329 tempest-AttachVolumeTestJSON-81924137 tempest-AttachVolumeTestJSON-81924137-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 791.675868] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-615dee3b-f265-40f5-8da3-0ae75eceb329 tempest-AttachVolumeTestJSON-81924137 tempest-AttachVolumeTestJSON-81924137-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 791.675868] nova-conductor[52216]: Traceback (most recent call last): [ 791.675868] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 791.675868] nova-conductor[52216]: return func(*args, **kwargs) [ 791.675868] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 791.675868] nova-conductor[52216]: selections = self._select_destinations( [ 791.675868] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 791.675868] nova-conductor[52216]: selections = self._schedule( [ 791.675868] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 791.675868] nova-conductor[52216]: self._ensure_sufficient_hosts( [ 791.675868] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 791.675868] nova-conductor[52216]: raise exception.NoValidHost(reason=reason) [ 791.675868] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 791.675868] nova-conductor[52216]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 791.676387] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-615dee3b-f265-40f5-8da3-0ae75eceb329 tempest-AttachVolumeTestJSON-81924137 tempest-AttachVolumeTestJSON-81924137-project-member] [instance: 0db9324f-9f2e-4ce7-85ba-678f3cd7cb1c] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 792.231700] nova-conductor[52217]: ERROR nova.conductor.manager [None req-528a11b6-73f2-48b0-8fa8-be49aa532464 tempest-ServerTagsTestJSON-1565376528 tempest-ServerTagsTestJSON-1565376528-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 792.231700] nova-conductor[52217]: Traceback (most recent call last): [ 792.231700] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 792.231700] nova-conductor[52217]: return func(*args, **kwargs) [ 792.231700] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 792.231700] nova-conductor[52217]: selections = self._select_destinations( [ 792.231700] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 792.231700] nova-conductor[52217]: selections = self._schedule( [ 792.231700] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 792.231700] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 792.231700] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 792.231700] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 792.231700] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 792.231700] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 792.231700] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 792.231700] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 792.231700] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 792.231700] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 792.231700] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 792.231700] nova-conductor[52217]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 792.231700] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 792.231700] nova-conductor[52217]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 792.231700] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 792.231700] nova-conductor[52217]: ERROR nova.conductor.manager result = self.transport._send( [ 792.231700] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 792.231700] nova-conductor[52217]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 792.231700] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 792.231700] nova-conductor[52217]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 792.231700] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 792.231700] nova-conductor[52217]: ERROR nova.conductor.manager raise result [ 792.231700] nova-conductor[52217]: ERROR nova.conductor.manager 
nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 792.231700] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last): [ 792.231700] nova-conductor[52217]: ERROR nova.conductor.manager [ 792.231700] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 792.231700] nova-conductor[52217]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 792.231700] nova-conductor[52217]: ERROR nova.conductor.manager [ 792.231700] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 792.231700] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._select_destinations( [ 792.231700] nova-conductor[52217]: ERROR nova.conductor.manager [ 792.231700] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 792.231700] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._schedule( [ 792.231700] nova-conductor[52217]: ERROR nova.conductor.manager [ 792.231700] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 792.231700] nova-conductor[52217]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 792.231700] nova-conductor[52217]: ERROR nova.conductor.manager [ 792.231700] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 792.231700] nova-conductor[52217]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 792.231700] nova-conductor[52217]: ERROR nova.conductor.manager [ 792.231700] nova-conductor[52217]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 792.231700] nova-conductor[52217]: ERROR nova.conductor.manager [ 792.231700] nova-conductor[52217]: ERROR nova.conductor.manager [ 792.239621] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-528a11b6-73f2-48b0-8fa8-be49aa532464 tempest-ServerTagsTestJSON-1565376528 tempest-ServerTagsTestJSON-1565376528-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 792.239847] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-528a11b6-73f2-48b0-8fa8-be49aa532464 tempest-ServerTagsTestJSON-1565376528 tempest-ServerTagsTestJSON-1565376528-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 792.240022] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-528a11b6-73f2-48b0-8fa8-be49aa532464 tempest-ServerTagsTestJSON-1565376528 tempest-ServerTagsTestJSON-1565376528-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 792.307881] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-528a11b6-73f2-48b0-8fa8-be49aa532464 tempest-ServerTagsTestJSON-1565376528 tempest-ServerTagsTestJSON-1565376528-project-member] [instance: 36cd5823-3601-4a33-bd6d-f2dabc408b4f] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 792.308728] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-528a11b6-73f2-48b0-8fa8-be49aa532464 tempest-ServerTagsTestJSON-1565376528 tempest-ServerTagsTestJSON-1565376528-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 792.309008] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-528a11b6-73f2-48b0-8fa8-be49aa532464 tempest-ServerTagsTestJSON-1565376528 tempest-ServerTagsTestJSON-1565376528-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 792.309242] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-528a11b6-73f2-48b0-8fa8-be49aa532464 tempest-ServerTagsTestJSON-1565376528 tempest-ServerTagsTestJSON-1565376528-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 792.312794] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-528a11b6-73f2-48b0-8fa8-be49aa532464 tempest-ServerTagsTestJSON-1565376528 tempest-ServerTagsTestJSON-1565376528-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 792.312794] nova-conductor[52217]: Traceback (most recent call last): [ 792.312794] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 792.312794] nova-conductor[52217]: return func(*args, **kwargs) [ 792.312794] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 792.312794] nova-conductor[52217]: selections = self._select_destinations( [ 792.312794] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 792.312794] nova-conductor[52217]: selections = self._schedule( [ 792.312794] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 792.312794] nova-conductor[52217]: self._ensure_sufficient_hosts( [ 792.312794] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 792.312794] nova-conductor[52217]: raise exception.NoValidHost(reason=reason) [ 792.312794] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 792.312794] nova-conductor[52217]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 792.313268] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-528a11b6-73f2-48b0-8fa8-be49aa532464 tempest-ServerTagsTestJSON-1565376528 tempest-ServerTagsTestJSON-1565376528-project-member] [instance: 36cd5823-3601-4a33-bd6d-f2dabc408b4f] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 792.856459] nova-conductor[52216]: ERROR nova.conductor.manager [None req-b7326362-11d0-4823-95dd-360351fdeb15 tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 792.856459] nova-conductor[52216]: Traceback (most recent call last):
[ 792.856459] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 792.856459] nova-conductor[52216]: return func(*args, **kwargs)
[ 792.856459] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 792.856459] nova-conductor[52216]: selections = self._select_destinations(
[ 792.856459] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 792.856459] nova-conductor[52216]: selections = self._schedule(
[ 792.856459] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 792.856459] nova-conductor[52216]: self._ensure_sufficient_hosts(
[ 792.856459] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 792.856459] nova-conductor[52216]: raise exception.NoValidHost(reason=reason)
[ 792.856459] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 792.856459] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 792.856459] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 792.856459] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0],
[ 792.856459] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 792.856459] nova-conductor[52216]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations(
[ 792.856459] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 792.856459] nova-conductor[52216]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 792.856459] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 792.856459] nova-conductor[52216]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 792.856459] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 792.856459] nova-conductor[52216]: ERROR nova.conductor.manager result = self.transport._send(
[ 792.856459] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 792.856459] nova-conductor[52216]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message,
[ 792.856459] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 792.856459] nova-conductor[52216]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 792.856459] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 792.856459] nova-conductor[52216]: ERROR nova.conductor.manager raise result
[ 792.856459] nova-conductor[52216]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 792.856459] nova-conductor[52216]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 792.856459] nova-conductor[52216]: ERROR nova.conductor.manager
[ 792.856459] nova-conductor[52216]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 792.856459] nova-conductor[52216]: ERROR nova.conductor.manager return func(*args, **kwargs)
[ 792.856459] nova-conductor[52216]: ERROR nova.conductor.manager
[ 792.856459] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 792.856459] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._select_destinations(
[ 792.856459] nova-conductor[52216]: ERROR nova.conductor.manager
[ 792.856459] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 792.856459] nova-conductor[52216]: ERROR nova.conductor.manager selections = self._schedule(
[ 792.856459] nova-conductor[52216]: ERROR nova.conductor.manager
[ 792.856459] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 792.856459] nova-conductor[52216]: ERROR nova.conductor.manager self._ensure_sufficient_hosts(
[ 792.856459] nova-conductor[52216]: ERROR nova.conductor.manager
[ 792.856459] nova-conductor[52216]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 792.856459] nova-conductor[52216]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason)
[ 792.856459] nova-conductor[52216]: ERROR nova.conductor.manager
[ 792.856459] nova-conductor[52216]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 792.856459] nova-conductor[52216]: ERROR nova.conductor.manager
[ 792.856459] nova-conductor[52216]: ERROR nova.conductor.manager
[ 792.868726] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b7326362-11d0-4823-95dd-360351fdeb15 tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 792.875128] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b7326362-11d0-4823-95dd-360351fdeb15 tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.004s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 792.875128] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b7326362-11d0-4823-95dd-360351fdeb15 tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 792.917874] nova-conductor[52216]: DEBUG nova.conductor.manager [None req-b7326362-11d0-4823-95dd-360351fdeb15 tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] [instance: 7243c03a-b7f6-4e38-97ca-2636ccb33212] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52216) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 792.918573] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b7326362-11d0-4823-95dd-360351fdeb15 tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 792.918790] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b7326362-11d0-4823-95dd-360351fdeb15 tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 792.918955] nova-conductor[52216]: DEBUG oslo_concurrency.lockutils [None req-b7326362-11d0-4823-95dd-360351fdeb15 tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52216) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 792.922046] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-b7326362-11d0-4823-95dd-360351fdeb15 tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 792.922046] nova-conductor[52216]: Traceback (most recent call last):
[ 792.922046] nova-conductor[52216]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 792.922046] nova-conductor[52216]: return func(*args, **kwargs)
[ 792.922046] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 792.922046] nova-conductor[52216]: selections = self._select_destinations(
[ 792.922046] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 792.922046] nova-conductor[52216]: selections = self._schedule(
[ 792.922046] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 792.922046] nova-conductor[52216]: self._ensure_sufficient_hosts(
[ 792.922046] nova-conductor[52216]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 792.922046] nova-conductor[52216]: raise exception.NoValidHost(reason=reason)
[ 792.922046] nova-conductor[52216]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 792.922046] nova-conductor[52216]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 792.922575] nova-conductor[52216]: WARNING nova.scheduler.utils [None req-b7326362-11d0-4823-95dd-360351fdeb15 tempest-ServerDiskConfigTestJSON-583028559 tempest-ServerDiskConfigTestJSON-583028559-project-member] [instance: 7243c03a-b7f6-4e38-97ca-2636ccb33212] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 794.402997] nova-conductor[52217]: ERROR nova.conductor.manager [None req-7b287fb3-60cd-4faa-9771-c3bce0da9465 tempest-ServersNegativeTestMultiTenantJSON-416670173 tempest-ServersNegativeTestMultiTenantJSON-416670173-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 794.402997] nova-conductor[52217]: Traceback (most recent call last):
[ 794.402997] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 794.402997] nova-conductor[52217]: return func(*args, **kwargs)
[ 794.402997] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 794.402997] nova-conductor[52217]: selections = self._select_destinations(
[ 794.402997] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 794.402997] nova-conductor[52217]: selections = self._schedule(
[ 794.402997] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 794.402997] nova-conductor[52217]: self._ensure_sufficient_hosts(
[ 794.402997] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 794.402997] nova-conductor[52217]: raise exception.NoValidHost(reason=reason)
[ 794.402997] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 794.402997] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 794.402997] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 794.402997] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0],
[ 794.402997] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 794.402997] nova-conductor[52217]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations(
[ 794.402997] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 794.402997] nova-conductor[52217]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 794.402997] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 794.402997] nova-conductor[52217]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 794.402997] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 794.402997] nova-conductor[52217]: ERROR nova.conductor.manager result = self.transport._send(
[ 794.402997] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 794.402997] nova-conductor[52217]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message,
[ 794.402997] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 794.402997] nova-conductor[52217]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 794.402997] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 794.402997] nova-conductor[52217]: ERROR nova.conductor.manager raise result
[ 794.402997] nova-conductor[52217]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 794.402997] nova-conductor[52217]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 794.402997] nova-conductor[52217]: ERROR nova.conductor.manager
[ 794.402997] nova-conductor[52217]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 794.402997] nova-conductor[52217]: ERROR nova.conductor.manager return func(*args, **kwargs)
[ 794.402997] nova-conductor[52217]: ERROR nova.conductor.manager
[ 794.402997] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 794.402997] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._select_destinations(
[ 794.402997] nova-conductor[52217]: ERROR nova.conductor.manager
[ 794.402997] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 794.402997] nova-conductor[52217]: ERROR nova.conductor.manager selections = self._schedule(
[ 794.402997] nova-conductor[52217]: ERROR nova.conductor.manager
[ 794.402997] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 794.402997] nova-conductor[52217]: ERROR nova.conductor.manager self._ensure_sufficient_hosts(
[ 794.402997] nova-conductor[52217]: ERROR nova.conductor.manager
[ 794.402997] nova-conductor[52217]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 794.402997] nova-conductor[52217]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason)
[ 794.402997] nova-conductor[52217]: ERROR nova.conductor.manager
[ 794.402997] nova-conductor[52217]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 794.402997] nova-conductor[52217]: ERROR nova.conductor.manager
[ 794.402997] nova-conductor[52217]: ERROR nova.conductor.manager
[ 794.412016] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-7b287fb3-60cd-4faa-9771-c3bce0da9465 tempest-ServersNegativeTestMultiTenantJSON-416670173 tempest-ServersNegativeTestMultiTenantJSON-416670173-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 794.415766] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-7b287fb3-60cd-4faa-9771-c3bce0da9465 tempest-ServersNegativeTestMultiTenantJSON-416670173 tempest-ServersNegativeTestMultiTenantJSON-416670173-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.004s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 794.415958] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-7b287fb3-60cd-4faa-9771-c3bce0da9465 tempest-ServersNegativeTestMultiTenantJSON-416670173 tempest-ServersNegativeTestMultiTenantJSON-416670173-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 794.466374] nova-conductor[52217]: DEBUG nova.conductor.manager [None req-7b287fb3-60cd-4faa-9771-c3bce0da9465 tempest-ServersNegativeTestMultiTenantJSON-416670173 tempest-ServersNegativeTestMultiTenantJSON-416670173-project-member] [instance: 234e31ed-ee9f-4e39-b52d-f2a713fff45b] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='2efa4364-ba59-4de9-978f-169a769ee710',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52217) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 794.467102] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-7b287fb3-60cd-4faa-9771-c3bce0da9465 tempest-ServersNegativeTestMultiTenantJSON-416670173 tempest-ServersNegativeTestMultiTenantJSON-416670173-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 794.467317] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-7b287fb3-60cd-4faa-9771-c3bce0da9465 tempest-ServersNegativeTestMultiTenantJSON-416670173 tempest-ServersNegativeTestMultiTenantJSON-416670173-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 794.467479] nova-conductor[52217]: DEBUG oslo_concurrency.lockutils [None req-7b287fb3-60cd-4faa-9771-c3bce0da9465 tempest-ServersNegativeTestMultiTenantJSON-416670173 tempest-ServersNegativeTestMultiTenantJSON-416670173-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52217) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 794.470377] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-7b287fb3-60cd-4faa-9771-c3bce0da9465 tempest-ServersNegativeTestMultiTenantJSON-416670173 tempest-ServersNegativeTestMultiTenantJSON-416670173-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 794.470377] nova-conductor[52217]: Traceback (most recent call last):
[ 794.470377] nova-conductor[52217]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 794.470377] nova-conductor[52217]: return func(*args, **kwargs)
[ 794.470377] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 794.470377] nova-conductor[52217]: selections = self._select_destinations(
[ 794.470377] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 794.470377] nova-conductor[52217]: selections = self._schedule(
[ 794.470377] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 794.470377] nova-conductor[52217]: self._ensure_sufficient_hosts(
[ 794.470377] nova-conductor[52217]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 794.470377] nova-conductor[52217]: raise exception.NoValidHost(reason=reason)
[ 794.470377] nova-conductor[52217]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 794.470377] nova-conductor[52217]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 794.471411] nova-conductor[52217]: WARNING nova.scheduler.utils [None req-7b287fb3-60cd-4faa-9771-c3bce0da9465 tempest-ServersNegativeTestMultiTenantJSON-416670173 tempest-ServersNegativeTestMultiTenantJSON-416670173-project-member] [instance: 234e31ed-ee9f-4e39-b52d-f2a713fff45b] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.