[ 483.996884] nova-conductor[51987]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 485.213461] nova-conductor[51987]: DEBUG oslo_db.sqlalchemy.engines [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=51987) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 485.238822] nova-conductor[51987]: DEBUG nova.context [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),b3c1ceaa-bc48-48ee-8b60-930585e76a41(cell1) {{(pid=51987) load_cells /opt/stack/nova/nova/context.py:464}}
[ 485.240583] nova-conductor[51987]: DEBUG oslo_concurrency.lockutils [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=51987) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 485.240787] nova-conductor[51987]: DEBUG oslo_concurrency.lockutils [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=51987) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 485.241246] nova-conductor[51987]: DEBUG oslo_concurrency.lockutils [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=51987) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 485.241588] nova-conductor[51987]: DEBUG oslo_concurrency.lockutils [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=51987) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 485.241769] nova-conductor[51987]: DEBUG oslo_concurrency.lockutils [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=51987) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 485.242657] nova-conductor[51987]: DEBUG oslo_concurrency.lockutils [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=51987) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 485.247804] nova-conductor[51987]: DEBUG oslo_db.sqlalchemy.engines [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=51987) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 485.248181] nova-conductor[51987]: DEBUG oslo_db.sqlalchemy.engines [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=51987) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 485.308451] nova-conductor[51987]: DEBUG oslo_concurrency.lockutils [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] Acquiring lock "singleton_lock" {{(pid=51987) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 485.308677] nova-conductor[51987]: DEBUG oslo_concurrency.lockutils [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] Acquired lock "singleton_lock" {{(pid=51987) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 485.308915] nova-conductor[51987]: DEBUG oslo_concurrency.lockutils [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] Releasing lock "singleton_lock" {{(pid=51987) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 485.309335] nova-conductor[51987]: INFO oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] Starting 2 workers
[ 485.315020] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] Started child 52435 {{(pid=51987) _start_child /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:575}}
[ 485.316955] nova-conductor[52435]: INFO nova.service [-] Starting conductor node (version 0.1.0)
[ 485.317521] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] Started child 52436 {{(pid=51987) _start_child /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:575}}
[ 485.318549] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] Full set of CONF: {{(pid=51987) wait /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:649}}
[ 485.318724] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] ******************************************************************************** {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2589}}
[ 485.319019] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] Configuration options gathered from: {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2590}}
[ 485.319105] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] command line args: ['--config-file', '/etc/nova/nova.conf'] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2591}}
[ 485.319371] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] config files: ['/etc/nova/nova.conf'] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2592}}
[ 485.319501] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] ================================================================================ {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2594}}
[ 485.319911] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] allow_resize_to_same_host = True {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.320133] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] arq_binding_timeout = 300 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.320315] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] block_device_allocate_retries = 60 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.323292] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] block_device_allocate_retries_interval = 3 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.323292] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cert = self.pem {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.323292] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] compute_driver = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.323398] nova-conductor[52436]: INFO nova.service [-] Starting conductor node (version 0.1.0)
[ 485.323601] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] compute_monitors = [] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.323601] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] config_dir = [] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.323601] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] config_drive_format = iso9660 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.323601] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] config_file = ['/etc/nova/nova.conf'] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.323601] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] config_source = [] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.323601] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] console_host = devstack {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.323601] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] control_exchange = nova {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.323774] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cpu_allocation_ratio = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.323774] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] daemon = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.323774] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] debug = True {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.323774] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] default_access_ip_network_name = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.323774] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] default_availability_zone = nova {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.323774] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] default_ephemeral_format = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.323920] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.323920] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] default_schedule_zone = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.324012] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] disk_allocation_ratio = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.324185] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] enable_new_services = True {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.324392] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] enabled_apis = ['osapi_compute'] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.324589] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] enabled_ssl_apis = [] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.324799] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] flat_injected = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.324971] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] force_config_drive = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.325154] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] force_raw_images = True {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.325338] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] graceful_shutdown_timeout = 5 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.325494] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] heal_instance_info_cache_interval = 60 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.325917] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] host = devstack {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.326142] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] initial_cpu_allocation_ratio = 4.0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.326321] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] initial_disk_allocation_ratio = 1.0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.328244] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] initial_ram_allocation_ratio = 1.0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.328244] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.328244] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] instance_build_timeout = 0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.328244] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] instance_delete_interval = 300 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.328244] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] instance_format = [instance: %(uuid)s] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.328244] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] instance_name_template = instance-%08x {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.328430] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] instance_usage_audit = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.328430] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] instance_usage_audit_period = month {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.328430] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.328430] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] instances_path = /opt/stack/data/nova/instances {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.328430] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] internal_service_availability_zone = internal {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.328430] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] key = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.328565] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] live_migration_retry_count = 30 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.328756] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] log_config_append = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.328887] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.329067] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] log_dir = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.329237] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] log_file = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.329368] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] log_options = True {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.329579] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] log_rotate_interval = 1 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.329737] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] log_rotate_interval_type = days {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.329919] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] log_rotation_type = none {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.330056] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.330180] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.330698] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.330698] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.330698] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.330880] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] long_rpc_timeout = 1800 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.330947] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] max_concurrent_builds = 10 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.331126] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] max_concurrent_live_migrations = 1 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.331291] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] max_concurrent_snapshots = 5 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.331454] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] max_local_block_devices = 3 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.331649] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] max_logfile_count = 30 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.331811] nova-conductor[52436]: DEBUG oslo_db.sqlalchemy.engines [None req-a2f0c89c-0480-4f84-9eab-3c2027094570 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=52436) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 485.331867] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] max_logfile_size_mb = 200 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.333022] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] maximum_instance_delete_attempts = 5 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.333022] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] metadata_listen = 0.0.0.0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.333022] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] metadata_listen_port = 8775 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.333222] nova-conductor[52435]: DEBUG oslo_db.sqlalchemy.engines [None req-2140178d-6975-4c00-b24f-26c82bf8280d None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=52435) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 485.333270] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] metadata_workers = 2 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.333270] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] migrate_max_retries = -1 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.333270] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] mkisofs_cmd = genisoimage {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.333270] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] my_block_storage_ip = 10.180.1.21 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.333270] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] my_ip = 10.180.1.21 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.333489] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] network_allocate_retries = 0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.333489] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.333645] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] osapi_compute_listen = 0.0.0.0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.333823] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] osapi_compute_listen_port = 8774 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.333981] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] osapi_compute_unique_server_name_scope = {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.334154] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] osapi_compute_workers = 2 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.334309] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] password_length = 12 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.334463] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] periodic_enable = True {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.334621] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] periodic_fuzzy_delay = 60 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.334778] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] pointer_model = usbtablet {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.334968] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] preallocate_images = none {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.335137] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] publish_errors = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.335284] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] pybasedir = /opt/stack/nova {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.335428] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] ram_allocation_ratio = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.335577] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] rate_limit_burst = 0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.335749] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] rate_limit_except_level = CRITICAL {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.335912] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] rate_limit_interval = 0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.336072] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] reboot_timeout = 0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.336383] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] reclaim_instance_interval = 0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.336537] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] record = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.336689] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] reimage_timeout_per_gb = 20 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.336846] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] report_interval = 10 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.337031] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] rescue_timeout = 0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.337184] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] reserved_host_cpus = 0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.337338] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] reserved_host_disk_mb = 0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.337490] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] reserved_host_memory_mb = 512 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.337666] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] reserved_huge_pages = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.337811] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] resize_confirm_window = 0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.337958] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] resize_fs_using_block_device = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.338119] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] resume_guests_state_on_host_boot = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.338278] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.338454] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] rpc_response_timeout = 60 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.338628] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] run_external_periodic_tasks = True {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.338788] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] running_deleted_instance_action = reap {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.338937] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] running_deleted_instance_poll_interval = 1800 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.339097] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] running_deleted_instance_timeout = 0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.339265] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] scheduler_instance_sync_interval = 120 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.339414] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] service_down_time = 60 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.339604] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] servicegroup_driver = db {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.339753] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] shelved_offload_time = 0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.339901] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] shelved_poll_interval = 3600 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 485.340067] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] shutdown_timeout = 0 {{(pid=51987) log_opt_values
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 485.340220] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] source_is_ipv6 = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 485.340371] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] ssl_only = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 485.340552] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] state_path = /opt/stack/data/nova {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 485.340705] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] sync_power_state_interval = 600 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 485.340870] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] sync_power_state_pool_size = 1000 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 485.341033] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] syslog_log_facility = LOG_USER {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 485.341183] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] tempdir = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 485.341349] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] timeout_nbd = 10 {{(pid=51987) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 485.341543] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] transport_url = **** {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 485.341708] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] update_resources_interval = 0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 485.341889] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] use_cow_images = True {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 485.342026] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] use_eventlog = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 485.342198] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] use_journal = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 485.342697] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] use_json = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 485.342697] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] use_rootwrap_daemon = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 485.342697] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] use_stderr = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 485.345405] 
nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] use_syslog = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 485.345405] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vcpu_pin_set = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 485.345405] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vif_plugging_is_fatal = True {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 485.345405] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vif_plugging_timeout = 300 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 485.345405] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] virt_mkfs = [] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 485.345405] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] volume_usage_poll_interval = 0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 485.345405] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] watch_log_file = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 485.345576] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] web = /usr/share/spice-html5 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 485.345576] nova-conductor[51987]: DEBUG oslo_service.service [None 
req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_concurrency.disable_process_locking = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.345576] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_concurrency.lock_path = /opt/stack/data/nova {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.345576] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.345576] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.345576] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_messaging_metrics.metrics_process_name = {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.345755] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.345755] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.345755] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api.auth_strategy = keystone {{(pid=51987) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.345755] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api.compute_link_prefix = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.345956] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.346097] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api.dhcp_domain = novalocal {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.346261] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api.enable_instance_password = True {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.346416] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api.glance_link_prefix = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.346600] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.346778] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api.instance_list_cells_batch_strategy = distributed {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.346946] nova-conductor[51987]: DEBUG 
oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api.instance_list_per_project_cells = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.347116] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api.list_records_by_skipping_down_cells = True {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.347271] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api.local_metadata_per_cell = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.347428] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api.max_limit = 1000 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.347588] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api.metadata_cache_expiration = 15 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.347754] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api.neutron_default_tenant_id = default {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.347913] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api.use_forwarded_for = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.348080] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api.use_neutron_default_nets = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.348242] 
nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.348400] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api.vendordata_dynamic_failure_fatal = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.348569] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.348748] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api.vendordata_dynamic_ssl_certfile = {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.348993] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api.vendordata_dynamic_targets = [] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.349088] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api.vendordata_jsonfile_path = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.349266] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api.vendordata_providers = ['StaticJSON'] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.349522] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cache.backend = dogpile.cache.memcached {{(pid=51987) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.349702] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cache.backend_argument = **** {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.349879] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cache.config_prefix = cache.oslo {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.350077] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cache.dead_timeout = 60.0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.350255] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cache.debug_cache_backend = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.350410] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cache.enable_retry_client = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.350565] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cache.enable_socket_keepalive = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.350726] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cache.enabled = True {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.350900] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cache.expiration_time = 600 {{(pid=51987) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.351079] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cache.hashclient_retry_attempts = 2 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.351260] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cache.hashclient_retry_delay = 1.0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.351427] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cache.memcache_dead_retry = 300 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.351611] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cache.memcache_password = {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.351780] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.351933] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.352103] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cache.memcache_pool_maxsize = 10 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.352259] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cache.memcache_pool_unused_timeout 
= 60 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.352411] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cache.memcache_sasl_enabled = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.352590] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cache.memcache_servers = ['localhost:11211'] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.352750] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cache.memcache_socket_timeout = 1.0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.352911] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cache.memcache_username = {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.353109] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cache.proxies = [] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.353285] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cache.retry_attempts = 2 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.353428] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cache.retry_delay = 0.0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.353606] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cache.socket_keepalive_count = 1 {{(pid=51987) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.353770] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cache.socket_keepalive_idle = 1 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.353941] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cache.socket_keepalive_interval = 1 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.354122] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cache.tls_allowed_ciphers = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.354205] nova-conductor[52436]: DEBUG nova.service [None req-a2f0c89c-0480-4f84-9eab-3c2027094570 None None] Creating RPC server for service conductor {{(pid=52436) start /opt/stack/nova/nova/service.py:182}} [ 485.354273] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cache.tls_cafile = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.354419] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cache.tls_certfile = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.354589] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cache.tls_enabled = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.354743] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cache.tls_keyfile = None {{(pid=51987) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.354989] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cinder.auth_section = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.355251] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cinder.auth_type = password {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.355419] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cinder.cafile = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.355617] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cinder.catalog_info = volumev3::publicURL {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.355794] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cinder.certfile = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.355948] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cinder.collect_timing = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.356136] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cinder.cross_az_attach = True {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.356293] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cinder.debug = False {{(pid=51987) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.356444] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cinder.endpoint_template = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.356623] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cinder.http_retries = 3 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.356778] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cinder.insecure = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.356928] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cinder.keyfile = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.357126] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cinder.os_region_name = RegionOne {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.357283] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cinder.split_loggers = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.357432] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cinder.timeout = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.358452] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=51987) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.358452] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] compute.cpu_dedicated_set = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.358452] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] compute.cpu_shared_set = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.358452] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] compute.image_type_exclude_list = [] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.358452] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] compute.live_migration_wait_for_vif_plug = True {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.358452] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] compute.max_concurrent_disk_ops = 0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.358758] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] compute.max_disk_devices_to_attach = -1 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.358758] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.358862] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.359086] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] compute.resource_provider_association_refresh = 300 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.359199] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] compute.shutdown_retry_interval = 10 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.359391] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.360044] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] conductor.workers = 2 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.360044] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] console.allowed_origins = [] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.360044] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] console.ssl_ciphers = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.360209] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] console.ssl_minimum_version = default {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.360279] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] consoleauth.token_ttl = 600 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.360460] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cyborg.cafile = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.360951] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cyborg.certfile = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.360951] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cyborg.collect_timing = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.360951] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cyborg.connect_retries = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.361135] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cyborg.connect_retry_delay = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.361214] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cyborg.endpoint_override = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.361366] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cyborg.insecure = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.361526] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cyborg.keyfile = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.361676] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cyborg.max_version = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.361823] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cyborg.min_version = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.361970] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cyborg.region_name = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.362129] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cyborg.service_name = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.362287] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cyborg.service_type = accelerator {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.362439] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cyborg.split_loggers = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.362588] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cyborg.status_code_retries = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.362737] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cyborg.status_code_retry_delay = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.362882] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cyborg.timeout = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.363071] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.363212] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] cyborg.version = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.363395] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] database.backend = sqlalchemy {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.363592] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] database.connection = **** {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.363766] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] database.connection_debug = 0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.363932] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] database.connection_parameters = {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.364100] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] database.connection_recycle_time = 3600 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.364260] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] database.connection_trace = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.364415] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] database.db_inc_retry_interval = True {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.364589] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] database.db_max_retries = 20 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.365044] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] database.db_max_retry_interval = 10 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.365044] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] database.db_retry_interval = 1 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.365164] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] database.max_overflow = 50 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.365242] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] database.max_pool_size = 5 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.365399] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] database.max_retries = 10 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.365558] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] database.mysql_enable_ndb = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.365748] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] database.mysql_sql_mode = TRADITIONAL {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.365901] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] database.mysql_wsrep_sync_wait = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.366067] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] database.pool_timeout = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.366230] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] database.retry_interval = 10 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.366393] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] database.slave_connection = **** {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.366582] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] database.sqlite_synchronous = True {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.366751] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] database.use_db_reconnect = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.366925] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api_database.backend = sqlalchemy {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.367109] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api_database.connection = **** {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.367283] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api_database.connection_debug = 0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.367868] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api_database.connection_parameters = {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.367868] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api_database.connection_recycle_time = 3600 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.367868] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api_database.connection_trace = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.368499] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api_database.db_inc_retry_interval = True {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.368720] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api_database.db_max_retries = 20 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.368900] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api_database.db_max_retry_interval = 10 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.369079] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api_database.db_retry_interval = 1 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.369255] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api_database.max_overflow = 50 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.369420] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api_database.max_pool_size = 5 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.369623] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api_database.max_retries = 10 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.369758] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api_database.mysql_enable_ndb = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.369927] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.370100] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api_database.mysql_wsrep_sync_wait = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.370265] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api_database.pool_timeout = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.370431] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api_database.retry_interval = 10 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.370589] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api_database.slave_connection = **** {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.370771] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] api_database.sqlite_synchronous = True {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.370992] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] devices.enabled_mdev_types = [] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.371199] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.371364] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] ephemeral_storage_encryption.enabled = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.371557] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] ephemeral_storage_encryption.key_size = 512 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.371759] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] glance.api_servers = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.371919] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] glance.cafile = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.372097] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] glance.certfile = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.372260] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] glance.collect_timing = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.372414] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] glance.connect_retries = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.372457] nova-conductor[52435]: DEBUG nova.service [None req-2140178d-6975-4c00-b24f-26c82bf8280d None None] Creating RPC server for service conductor {{(pid=52435) start /opt/stack/nova/nova/service.py:182}}
[ 485.372575] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] glance.connect_retry_delay = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.372752] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] glance.debug = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.372963] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] glance.default_trusted_certificate_ids = [] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.373140] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] glance.enable_certificate_validation = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.373301] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] glance.enable_rbd_download = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.373451] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] glance.endpoint_override = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.373660] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] glance.insecure = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.373833] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] glance.keyfile = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.373988] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] glance.max_version = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.374447] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] glance.min_version = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.374447] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] glance.num_retries = 3 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.374561] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] glance.rbd_ceph_conf = {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.374703] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] glance.rbd_connect_timeout = 5 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.374883] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] glance.rbd_pool = {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.375061] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] glance.rbd_user = {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.375241] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] glance.region_name = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.375397] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] glance.service_name = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.375565] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] glance.service_type = image {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.375723] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] glance.split_loggers = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.375880] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] glance.status_code_retries = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.376041] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] glance.status_code_retry_delay = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.376200] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] glance.timeout = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.376375] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.376536] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] glance.verify_glance_signatures = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.376691] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] glance.version = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.376852] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] guestfs.debug = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.377058] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] hyperv.config_drive_cdrom = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.377223] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] hyperv.config_drive_inject_password = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.377385] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.377548] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] hyperv.enable_instance_metrics_collection = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.377706] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] hyperv.enable_remotefx = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.377868] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] hyperv.instances_path_share = {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.378039] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] hyperv.iscsi_initiator_list = [] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.378205] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] hyperv.limit_cpu_features = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.378365] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.378521] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.378683] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] hyperv.power_state_check_timeframe = 60 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.378854] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] hyperv.power_state_event_polling_interval = 2 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.379041] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.379203] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] hyperv.use_multipath_io = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.379360] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] hyperv.volume_attach_retry_count = 10 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.379519] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] hyperv.volume_attach_retry_interval = 5 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.379671] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] hyperv.vswitch_name = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.379823] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.379985] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] mks.enabled = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.380582] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.380773] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] image_cache.manager_interval = 2400 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.380929] nova-conductor[52436]: DEBUG nova.service [None req-a2f0c89c-0480-4f84-9eab-3c2027094570 None None] Join ServiceGroup membership for this service conductor {{(pid=52436) start /opt/stack/nova/nova/service.py:199}}
[ 485.380967] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] image_cache.precache_concurrency = 1 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.381129] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] image_cache.remove_unused_base_images = True {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.381237] nova-conductor[52436]: DEBUG nova.servicegroup.drivers.db [None req-a2f0c89c-0480-4f84-9eab-3c2027094570 None None] DB_Driver: join new ServiceGroup member devstack to the conductor group, service = {{(pid=52436) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}}
[ 485.381315] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.381532] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.381660] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] image_cache.subdirectory_name = _base {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.381833] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] ironic.api_max_retries = 60 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.381994] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] ironic.api_retry_interval = 2 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.382166] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] ironic.auth_section = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.382329] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] ironic.auth_type = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.382505] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] ironic.cafile = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.382661] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] ironic.certfile = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.382843] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] ironic.collect_timing = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.383052] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] ironic.connect_retries = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.383244] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] ironic.connect_retry_delay = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.383403] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] ironic.endpoint_override = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.383586] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] ironic.insecure = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.383748] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] ironic.keyfile = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.383903] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] ironic.max_version = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.384070] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] ironic.min_version = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.384229] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] ironic.partition_key = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.384388] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] ironic.peer_list = [] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.384545] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] ironic.region_name = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.384705] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] ironic.serial_console_state_timeout = 10 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.384856] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] ironic.service_name = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.385049] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] ironic.service_type = baremetal {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.385215] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] ironic.split_loggers = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.385367] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] ironic.status_code_retries = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.385520] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] ironic.status_code_retry_delay = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.385675] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] ironic.timeout = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.385847] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.386259] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] ironic.version = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.386259] nova-conductor[51987]: DEBUG oslo_service.service [None
req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.386408] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] key_manager.fixed_key = **** {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.386633] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.386810] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] barbican.barbican_api_version = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.386970] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] barbican.barbican_endpoint = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.387173] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] barbican.barbican_endpoint_type = public {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.387335] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] barbican.barbican_region_name = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.387487] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] barbican.cafile = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.387640] 
nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] barbican.certfile = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.387796] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] barbican.collect_timing = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.387948] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] barbican.insecure = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.388118] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] barbican.keyfile = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.388303] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] barbican.number_of_retries = 60 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.388466] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] barbican.retry_delay = 1 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.388653] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] barbican.send_service_user_token = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.388818] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] barbican.split_loggers = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.388970] nova-conductor[51987]: 
DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] barbican.timeout = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.389139] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] barbican.verify_ssl = True {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.389291] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] barbican.verify_ssl_path = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.389479] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] barbican_service_user.auth_section = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.389640] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] barbican_service_user.auth_type = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.389794] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] barbican_service_user.cafile = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.389941] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] barbican_service_user.certfile = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.390109] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] barbican_service_user.collect_timing = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.390266] 
nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] barbican_service_user.insecure = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.390418] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] barbican_service_user.keyfile = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.390577] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] barbican_service_user.split_loggers = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.390727] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] barbican_service_user.timeout = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.390916] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vault.approle_role_id = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.391084] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vault.approle_secret_id = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.391238] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vault.cafile = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.391391] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vault.certfile = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.391584] 
nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vault.collect_timing = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.391706] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vault.insecure = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.391858] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vault.keyfile = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.392055] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vault.kv_mountpoint = secret {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.392220] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vault.kv_version = 2 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.392379] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vault.namespace = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.392531] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vault.root_token_id = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.392689] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vault.split_loggers = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.392841] nova-conductor[51987]: DEBUG oslo_service.service [None 
req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vault.ssl_ca_crt_file = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.392993] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vault.timeout = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.393217] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vault.use_ssl = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.393418] nova-conductor[52435]: DEBUG nova.service [None req-2140178d-6975-4c00-b24f-26c82bf8280d None None] Join ServiceGroup membership for this service conductor {{(pid=52435) start /opt/stack/nova/nova/service.py:199}} [ 485.393460] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.393684] nova-conductor[52435]: DEBUG nova.servicegroup.drivers.db [None req-2140178d-6975-4c00-b24f-26c82bf8280d None None] DB_Driver: join new ServiceGroup member devstack to the conductor group, service = {{(pid=52435) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 485.393740] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] keystone.cafile = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.393864] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] keystone.certfile = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.394061] nova-conductor[51987]: DEBUG oslo_service.service [None 
req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] keystone.collect_timing = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.394227] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] keystone.connect_retries = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.394383] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] keystone.connect_retry_delay = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.394560] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] keystone.endpoint_override = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.394734] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] keystone.insecure = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.394887] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] keystone.keyfile = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.395061] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] keystone.max_version = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.395215] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] keystone.min_version = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.395368] nova-conductor[51987]: DEBUG oslo_service.service [None 
req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] keystone.region_name = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.395521] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] keystone.service_name = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.395693] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] keystone.service_type = identity {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.395871] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] keystone.split_loggers = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.396051] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] keystone.status_code_retries = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.396231] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] keystone.status_code_retry_delay = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.396386] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] keystone.timeout = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.396565] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.396723] nova-conductor[51987]: DEBUG 
oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] keystone.version = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.396962] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.connection_uri = {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.397160] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.cpu_mode = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.397322] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.cpu_model_extra_flags = [] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.397484] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.cpu_models = [] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.397650] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.cpu_power_governor_high = performance {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.397814] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.cpu_power_governor_low = powersave {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.397970] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.cpu_power_management = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.398169] nova-conductor[51987]: DEBUG 
oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.398354] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.device_detach_attempts = 8 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.398512] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.device_detach_timeout = 20 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.398673] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.disk_cachemodes = [] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.398825] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.disk_prefix = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.398982] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.enabled_perf_events = [] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.399153] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.file_backed_memory = 0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.399311] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.gid_maps = [] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.399490] nova-conductor[51987]: DEBUG 
oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.hw_disk_discard = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.399646] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.hw_machine_type = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.399809] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.images_rbd_ceph_conf = {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.399967] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.400143] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.400306] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.images_rbd_glance_store_name = {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.400470] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.images_rbd_pool = rbd {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.400631] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.images_type = default {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.400790] 
nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.images_volume_group = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.400971] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.inject_key = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.401142] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.inject_partition = -2 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.401297] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.inject_password = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.401451] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.iscsi_iface = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.401635] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.iser_use_multipath = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.401800] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.live_migration_bandwidth = 0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.401958] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.live_migration_completion_timeout = 800 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 
485.402128] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.live_migration_downtime = 500 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.402285] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.live_migration_downtime_delay = 75 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.402440] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.live_migration_downtime_steps = 10 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.402597] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.live_migration_inbound_addr = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.402756] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.live_migration_permit_auto_converge = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.402911] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.live_migration_permit_post_copy = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.403077] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.live_migration_scheme = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.403242] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.live_migration_timeout_action = abort 
{{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.403397] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.live_migration_tunnelled = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.403609] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.live_migration_uri = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.403753] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.live_migration_with_native_tls = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.403932] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.max_queues = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.404109] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.mem_stats_period_seconds = 10 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.404272] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.nfs_mount_options = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.404628] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.nfs_mount_point_base = /opt/stack/data/nova/mnt {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.404819] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.num_aoe_discover_tries = 3 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.404984] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.num_iser_scan_tries = 5 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.405157] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.num_memory_encrypted_guests = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.405317] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.num_nvme_discover_tries = 5 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.405475] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.num_pcie_ports = 0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.405635] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.num_volume_scan_tries = 5 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.405857] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.pmem_namespaces = [] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.406044] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.quobyte_client_cfg = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.406298] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.quobyte_mount_point_base = /opt/stack/data/nova/mnt {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.406466] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.rbd_connect_timeout = 5 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.406630] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.406793] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.406947] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.rbd_secret_uuid = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.407114] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.rbd_user = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.407271] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.realtime_scheduler_priority = 1 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.407439] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.remote_filesystem_transport = ssh {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.407595] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.rescue_image_id = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.407750] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.rescue_kernel_id = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.407900] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.rescue_ramdisk_id = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.408071] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.rng_dev_path = /dev/urandom {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.408226] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.rx_queue_size = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.408388] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.smbfs_mount_options = {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.408602] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.smbfs_mount_point_base = /opt/stack/data/nova/mnt {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.408765] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.snapshot_compression = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.408917] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.snapshot_image_format = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.409138] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.409302] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.sparse_logical_volumes = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.409462] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.swtpm_enabled = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.409630] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.swtpm_group = tss {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.409798] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.swtpm_user = tss {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.409960] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.sysinfo_serial = unique {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.410130] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.tx_queue_size = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.410294] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.uid_maps = [] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.410453] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.use_virtio_for_bridges = True {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.410623] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.virt_type = kvm {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.410787] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.volume_clear = zero {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.410943] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.volume_clear_size = 0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.411118] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.volume_use_multipath = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.411275] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.vzstorage_cache_path = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.411438] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.411600] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.vzstorage_mount_group = qemu {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.411780] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.vzstorage_mount_opts = [] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.411945] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.412168] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/nova/mnt {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.412340] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.vzstorage_mount_user = stack {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.412501] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.412691] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] neutron.auth_section = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.412857] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] neutron.auth_type = password {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.413024] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] neutron.cafile = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.413181] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] neutron.certfile = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.413336] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] neutron.collect_timing = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.413516] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] neutron.connect_retries = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.413677] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] neutron.connect_retry_delay = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.413863] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] neutron.default_floating_pool = public {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.414027] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] neutron.endpoint_override = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.414188] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] neutron.extension_sync_interval = 600 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.414343] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] neutron.http_retries = 3 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.414502] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] neutron.insecure = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.414680] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] neutron.keyfile = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.414842] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] neutron.max_version = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.415027] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] neutron.metadata_proxy_shared_secret = **** {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.415178] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] neutron.min_version = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.415343] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] neutron.ovs_bridge = br-int {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.415501] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] neutron.physnets = [] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.415666] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] neutron.region_name = RegionOne {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.415838] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] neutron.service_metadata_proxy = True {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.416013] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] neutron.service_name = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.416193] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] neutron.service_type = network {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.416354] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] neutron.split_loggers = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.416503] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] neutron.status_code_retries = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.416655] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] neutron.status_code_retry_delay = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.416804] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] neutron.timeout = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.416976] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.417143] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] neutron.version = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.417309] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] notifications.bdms_in_notifications = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.417487] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] notifications.default_level = INFO {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.417653] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] notifications.notification_format = unversioned {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.417810] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] notifications.notify_on_state_change = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.417978] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.418185] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] pci.alias = [] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.418350] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] pci.device_spec = [] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.418509] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] pci.report_in_placement = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.418711] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] placement.auth_section = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.418880] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] placement.auth_type = password {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.419075] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] placement.auth_url = http://10.180.1.21/identity {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.419234] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] placement.cafile = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.419387] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] placement.certfile = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.419565] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] placement.collect_timing = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.419720] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] placement.connect_retries = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.419878] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] placement.connect_retry_delay = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.420047] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] placement.default_domain_id = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.420201] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] placement.default_domain_name = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.420354] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] placement.domain_id = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.420500] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] placement.domain_name = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.420656] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] placement.endpoint_override = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.420809] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] placement.insecure = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.420959] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] placement.keyfile = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.421127] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] placement.max_version = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.421278] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] placement.min_version = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.421441] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] placement.password = **** {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.421593] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] placement.project_domain_id = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.421753] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] placement.project_domain_name = Default {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.422218] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] placement.project_id = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.422218] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] placement.project_name = service {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.422283] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] placement.region_name = RegionOne {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.422384] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] placement.service_name = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.422543] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] placement.service_type = placement {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.422700] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] placement.split_loggers = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.422853] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] placement.status_code_retries = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.423021] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] placement.status_code_retry_delay = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.423174] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] placement.system_scope = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.423325] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] placement.timeout = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.423482] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] placement.trust_id = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.423662] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] placement.user_domain_id = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.423829] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] placement.user_domain_name = Default {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.423989] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] placement.user_id = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.424168] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] placement.username = placement {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.424344] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.424503] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] placement.version = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.424690] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] quota.cores = 20 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.424856] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] quota.count_usage_from_placement = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.425031] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.425221] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] quota.injected_file_content_bytes = 10240 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.425385] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] quota.injected_file_path_length = 255 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.425543] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] quota.injected_files = 5 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.425703] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] quota.instances = 10 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.425882] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] quota.key_pairs = 100 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.426080] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] quota.metadata_items = 128 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.426245] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] quota.ram = 51200 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.426410] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] quota.recheck_quota = True {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.426572] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] quota.server_group_members = 10 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.426731] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] quota.server_groups = 10 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.426891] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] rdp.enabled = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.427208] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.427419] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.427608] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.427792] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] scheduler.image_metadata_prefilter = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.427969] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.428164] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] scheduler.max_attempts = 3 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.428349] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] scheduler.max_placement_results = 1000 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.428524] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.428687] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] scheduler.query_placement_for_availability_zone = True {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.428843] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] scheduler.query_placement_for_image_type_support = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.429030] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.429231] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] scheduler.workers = 2 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.429411] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.429575] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] filter_scheduler.aggregate_image_properties_isolation_separator = . {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.429771] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.429937] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.430128] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.430310] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.430467] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.430676] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 485.430844] nova-conductor[51987]: DEBUG oslo_service.service [None
req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] filter_scheduler.host_subset_size = 1 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.431012] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] filter_scheduler.image_properties_default_architecture = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.431174] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.431332] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] filter_scheduler.isolated_hosts = [] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.431511] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] filter_scheduler.isolated_images = [] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.431674] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] filter_scheduler.max_instances_per_host = 50 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.431845] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.432027] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] filter_scheduler.pci_in_placement = False {{(pid=51987) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.432206] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.432366] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.432523] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.432678] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.432834] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.432988] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.433182] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] filter_scheduler.track_instance_changes = True {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.433355] nova-conductor[51987]: 
DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.433546] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] metrics.required = True {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.433719] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] metrics.weight_multiplier = 1.0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.433879] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] metrics.weight_of_unavailable = -10000.0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.434050] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] metrics.weight_setting = [] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.434337] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.434508] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] serial_console.enabled = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.434719] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] serial_console.port_range = 10000:20000 {{(pid=51987) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.434893] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.435074] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.435260] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] serial_console.serialproxy_port = 6083 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.435423] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] service_user.auth_section = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.435594] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] service_user.auth_type = password {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.435751] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] service_user.cafile = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.435930] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] service_user.certfile = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.436110] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] service_user.collect_timing = False 
{{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.436273] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] service_user.insecure = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.436425] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] service_user.keyfile = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.436592] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] service_user.send_service_user_token = True {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.436753] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] service_user.split_loggers = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.436903] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] service_user.timeout = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.437079] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] spice.agent_enabled = True {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.437263] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] spice.enabled = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.437583] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] spice.html5proxy_base_url = 
http://127.0.0.1:6082/spice_auto.html {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.437805] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] spice.html5proxy_host = 0.0.0.0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.437971] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] spice.html5proxy_port = 6082 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.438142] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] spice.image_compression = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.438295] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] spice.jpeg_compression = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.438446] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] spice.playback_compression = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.438628] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] spice.server_listen = 127.0.0.1 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.438812] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.438966] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 
None None] spice.streaming_mode = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.439142] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] spice.zlib_compression = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.439293] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] upgrade_levels.baseapi = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.439444] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] upgrade_levels.cert = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.439609] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] upgrade_levels.compute = auto {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.439759] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] upgrade_levels.conductor = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.439907] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] upgrade_levels.scheduler = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.440081] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vendordata_dynamic_auth.auth_section = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.440241] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] 
vendordata_dynamic_auth.auth_type = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.440391] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vendordata_dynamic_auth.cafile = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.440577] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vendordata_dynamic_auth.certfile = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.440790] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vendordata_dynamic_auth.collect_timing = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.440955] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vendordata_dynamic_auth.insecure = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.441131] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vendordata_dynamic_auth.keyfile = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.441302] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vendordata_dynamic_auth.split_loggers = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.441458] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vendordata_dynamic_auth.timeout = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.441655] nova-conductor[51987]: DEBUG 
oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vmware.api_retry_count = 10 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.441815] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vmware.ca_file = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.442007] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vmware.cache_prefix = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.442180] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vmware.cluster_name = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.442352] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vmware.connection_pool_size = 10 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.442506] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vmware.console_delay_seconds = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.442662] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vmware.datastore_regex = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.442813] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vmware.host_ip = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.442963] nova-conductor[51987]: DEBUG oslo_service.service [None 
req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vmware.host_password = **** {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.443130] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vmware.host_port = 443 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.443283] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vmware.host_username = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.443439] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vmware.insecure = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.443623] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vmware.integration_bridge = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.443793] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vmware.maximum_objects = 100 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.443951] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vmware.pbm_default_policy = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.444126] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vmware.pbm_enabled = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.444279] nova-conductor[51987]: DEBUG oslo_service.service [None 
req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vmware.pbm_wsdl_location = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.444446] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vmware.serial_log_dir = /opt/vmware/vspc {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.444613] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vmware.serial_port_proxy_uri = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.444778] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vmware.serial_port_service_uri = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.444943] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vmware.task_poll_interval = 0.5 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.445114] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vmware.use_linked_clone = True {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.445280] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vmware.vnc_keymap = en-us {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.445460] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vmware.vnc_port = 5900 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.445621] nova-conductor[51987]: DEBUG oslo_service.service [None 
req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vmware.vnc_port_total = 10000 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.445830] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vnc.auth_schemes = ['none'] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.446018] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vnc.enabled = True {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.446325] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.446510] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.446685] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vnc.novncproxy_port = 6080 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.446860] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vnc.server_listen = 127.0.0.1 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.447040] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.447200] nova-conductor[51987]: DEBUG oslo_service.service 
[None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vnc.vencrypt_ca_certs = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.447354] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vnc.vencrypt_client_cert = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.447506] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] vnc.vencrypt_client_key = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.447708] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.447870] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] workarounds.disable_deep_image_inspection = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.448038] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] workarounds.disable_fallback_pcpu_query = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.448199] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] workarounds.disable_group_policy_check_upcall = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.448356] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=51987) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.448512] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] workarounds.disable_rootwrap = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.448669] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] workarounds.enable_numa_live_migration = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.448821] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.448975] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.449144] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] workarounds.handle_virt_lifecycle_events = True {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.449297] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] workarounds.libvirt_disable_apic = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.449451] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] workarounds.never_download_image_if_on_rbd = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.449604] nova-conductor[51987]: DEBUG oslo_service.service [None 
req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.449756] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.449925] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.450147] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.450314] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.450469] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.450624] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.450804] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=51987) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.450960] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.451151] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.451316] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] wsgi.client_socket_timeout = 900 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.451475] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] wsgi.default_pool_size = 1000 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.451635] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] wsgi.keep_alive = True {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.451794] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] wsgi.max_header_line = 16384 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.451950] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] wsgi.secure_proxy_ssl_header = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.452116] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] 
wsgi.ssl_ca_file = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.452270] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] wsgi.ssl_cert_file = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.452425] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] wsgi.ssl_key_file = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.452583] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] wsgi.tcp_keepidle = 600 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.452755] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.452914] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] zvm.ca_file = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.453080] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] zvm.cloud_connector_url = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.453297] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] zvm.image_tmp_path = /opt/stack/data/nova/images {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.453459] nova-conductor[51987]: DEBUG 
oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] zvm.reachable_timeout = 300 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.453738] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_policy.enforce_new_defaults = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.453946] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_policy.enforce_scope = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.454158] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_policy.policy_default_rule = default {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.454360] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.454551] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_policy.policy_file = policy.yaml {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.454766] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.454946] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=51987) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.455119] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.455274] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.455452] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.455648] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.455842] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.456084] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] profiler.connection_string = messaging:// {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.456274] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] profiler.enabled = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.456459] nova-conductor[51987]: DEBUG oslo_service.service [None 
req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] profiler.es_doc_type = notification {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.456637] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] profiler.es_scroll_size = 10000 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.456804] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] profiler.es_scroll_time = 2m {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.456964] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] profiler.filter_error_trace = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.457141] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] profiler.hmac_keys = SECRET_KEY {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.457306] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] profiler.sentinel_service_name = mymaster {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.457487] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] profiler.socket_timeout = 0.1 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.457653] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] profiler.trace_sqlalchemy = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.457841] nova-conductor[51987]: DEBUG 
oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] remote_debug.host = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.458024] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] remote_debug.port = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.458211] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.458374] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.458535] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.458693] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.458856] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.459023] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=51987) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.459185] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.459338] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.459493] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.459646] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.459813] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.459980] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.460161] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.460319] nova-conductor[51987]: DEBUG 
oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.460474] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.460646] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.460804] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.460957] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.461129] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.461295] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.461452] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] 
oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.461611] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.461766] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.461923] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.462097] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.462265] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_messaging_rabbit.ssl = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.462431] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.462593] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.462750] 
nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.462913] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.463088] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_messaging_rabbit.ssl_version = {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.463293] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.463474] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_messaging_notifications.retry = -1 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.463683] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.463861] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_messaging_notifications.transport_url = **** {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.464075] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_limit.auth_section = None {{(pid=51987) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.464238] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_limit.auth_type = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.464392] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_limit.cafile = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.464546] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_limit.certfile = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.464723] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_limit.collect_timing = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.464880] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_limit.connect_retries = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.465044] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_limit.connect_retry_delay = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.465203] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_limit.endpoint_id = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.465353] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_limit.endpoint_override = None {{(pid=51987) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.465507] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_limit.insecure = False {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.465658] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_limit.keyfile = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.465817] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_limit.max_version = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.465984] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_limit.min_version = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.466153] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_limit.region_name = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.466331] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_limit.service_name = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.466483] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_limit.service_type = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.466639] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_limit.split_loggers = False {{(pid=51987) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.466789] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_limit.status_code_retries = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.466943] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_limit.status_code_retry_delay = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.467103] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_limit.timeout = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.467256] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_limit.valid_interfaces = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.467406] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_limit.version = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.467605] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_reports.file_event_handler = None {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.467768] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_reports.file_event_handler_interval = 1 {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.467921] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] oslo_reports.log_dir = None {{(pid=51987) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 485.468059] nova-conductor[51987]: DEBUG oslo_service.service [None req-a9e1c109-6243-43ed-a0af-11a54a8ed630 None None] ******************************************************************************** {{(pid=51987) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2613}} [ 570.334604] nova-conductor[52435]: DEBUG oslo_db.sqlalchemy.engines [None req-7a2f8b6f-4f62-4929-9693-89b865fb2dfc None None] Parent process 51987 forked (52435) with an open database connection, which is being discarded and recreated. {{(pid=52435) checkout /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:434}} [ 610.618971] nova-conductor[52436]: DEBUG oslo_db.sqlalchemy.engines [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Parent process 51987 forked (52436) with an open database connection, which is being discarded and recreated. {{(pid=52436) checkout /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:434}} [ 610.679703] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Took 0.94 seconds to select destinations for 1 instance(s). 
{{(pid=52435) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 610.706550] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.707016] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.711072] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.002s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.714678] nova-conductor[52435]: DEBUG oslo_db.sqlalchemy.engines [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=52435) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 610.773261] 
nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.773473] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.774327] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.774479] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.774759] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 
tempest-ServerExternalEventsTest-1363802782-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.775545] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.783239] nova-conductor[52435]: DEBUG oslo_db.sqlalchemy.engines [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=52435) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 610.806495] nova-conductor[52435]: DEBUG nova.quota [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Getting quotas for project 127a3eb5002944c5a51c17c72f860bca. Resources: {'instances', 'ram', 'cores'} {{(pid=52435) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 610.809415] nova-conductor[52435]: DEBUG nova.quota [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Getting quotas for user dd051ff4b35d4acfa361115f90e620a3 and project 127a3eb5002944c5a51c17c72f860bca. 
Resources: {'instances', 'ram', 'cores'} {{(pid=52435) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 610.815081] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52435) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 610.816045] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.816822] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.816822] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.820388] nova-conductor[52435]: DEBUG 
nova.conductor.manager [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 610.821074] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.821286] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.821427] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Lock 
"b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.845143] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.845315] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.845489] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.845770] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Acquiring lock "compute-rpcapi-router" {{(pid=52435) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 610.845952] 
nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Acquired lock "compute-rpcapi-router" {{(pid=52435) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 610.846479] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-b378363e-2928-4242-b524-51a2ac3fd065 None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.846663] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-b378363e-2928-4242-b524-51a2ac3fd065 None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.846822] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-b378363e-2928-4242-b524-51a2ac3fd065 None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.847195] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-b378363e-2928-4242-b524-51a2ac3fd065 None None] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.847360] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-b378363e-2928-4242-b524-51a2ac3fd065 None None] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.847515] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-b378363e-2928-4242-b524-51a2ac3fd065 None None] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.852994] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Took 0.22 seconds to select destinations for 1 instance(s). {{(pid=52436) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 610.855424] nova-conductor[52435]: INFO nova.compute.rpcapi [None req-b378363e-2928-4242-b524-51a2ac3fd065 None None] Automatically selected compute RPC version 6.2 from minimum service version 66 [ 610.856683] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-b378363e-2928-4242-b524-51a2ac3fd065 None None] Releasing lock "compute-rpcapi-router" {{(pid=52435) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 610.884042] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.886270] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 
tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.886270] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.002s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.891142] nova-conductor[52436]: DEBUG oslo_db.sqlalchemy.engines [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=52436) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 611.007993] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 611.007993] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 
tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 611.007993] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 611.008307] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 611.008493] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 611.008647] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 611.019518] nova-conductor[52436]: DEBUG oslo_db.sqlalchemy.engines [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=52436) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 611.044281] nova-conductor[52436]: DEBUG nova.quota [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Getting quotas for project 72a0c826169d4687ab1a83684f443d9a. Resources: {'instances', 'ram', 'cores'} {{(pid=52436) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 611.048421] nova-conductor[52436]: DEBUG nova.quota [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Getting quotas for user 888b9b022a5449a882fe7877924d1a02 and project 72a0c826169d4687ab1a83684f443d9a. 
Resources: {'instances', 'ram', 'cores'} {{(pid=52436) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 611.056522] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] [instance: de589259-86c5-4830-8507-2de7ad76c034] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52436) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 611.056522] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 611.056522] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 611.056522] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 611.060894] 
nova-conductor[52436]: DEBUG nova.conductor.manager [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] [instance: de589259-86c5-4830-8507-2de7ad76c034] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 611.061619] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 611.061826] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 611.061991] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 
tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 611.098698] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 611.098950] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 611.099151] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 611.099472] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Acquiring lock "compute-rpcapi-router" 
{{(pid=52436) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 611.099548] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Acquired lock "compute-rpcapi-router" {{(pid=52436) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 611.100070] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-024a02d9-8b5b-4487-b459-837b9a619478 None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 611.100256] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-024a02d9-8b5b-4487-b459-837b9a619478 None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 611.102254] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-024a02d9-8b5b-4487-b459-837b9a619478 None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 611.102254] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-024a02d9-8b5b-4487-b459-837b9a619478 None None] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 611.102254] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None 
req-024a02d9-8b5b-4487-b459-837b9a619478 None None] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 611.102254] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-024a02d9-8b5b-4487-b459-837b9a619478 None None] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 611.112072] nova-conductor[52436]: INFO nova.compute.rpcapi [None req-024a02d9-8b5b-4487-b459-837b9a619478 None None] Automatically selected compute RPC version 6.2 from minimum service version 66 [ 611.112555] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-024a02d9-8b5b-4487-b459-837b9a619478 None None] Releasing lock "compute-rpcapi-router" {{(pid=52436) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 611.844990] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Took 0.16 seconds to select destinations for 1 instance(s). 
{{(pid=52435) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 611.861103] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 611.861331] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 611.861496] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 611.894387] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 611.894608] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None 
req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 611.894773] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 611.895139] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 611.895341] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 611.895498] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] 
Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 611.905448] nova-conductor[52435]: DEBUG nova.quota [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Getting quotas for project a46c26434cd94ea2bcea9461abb7f359. Resources: {'instances', 'ram', 'cores'} {{(pid=52435) _get_quotas /opt/stack/nova/nova/quota.py:393}}
[ 611.907500] nova-conductor[52435]: DEBUG nova.quota [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Getting quotas for user a00608717acf4b5db436870c5938dadf and project a46c26434cd94ea2bcea9461abb7f359. Resources: {'instances', 'ram', 'cores'} {{(pid=52435) _get_quotas /opt/stack/nova/nova/quota.py:383}}
[ 611.913910] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52435) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}}
[ 611.914553] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 611.915255] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 611.915255] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 611.920333] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 611.921564] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 611.921564] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 611.921564] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 611.935955] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 611.936104] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 611.936272] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 614.044989] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Took 0.16 seconds to select destinations for 1 instance(s). {{(pid=52436) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}}
[ 614.073971] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 614.074219] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 614.074385] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 614.114211] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 614.114392] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 614.114574] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 614.114999] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 614.115257] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 614.115450] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 614.125022] nova-conductor[52436]: DEBUG nova.quota [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Getting quotas for project c1f3e5c6ab1a4a5baf037e03c6f22934. Resources: {'instances', 'ram', 'cores'} {{(pid=52436) _get_quotas /opt/stack/nova/nova/quota.py:393}}
[ 614.127544] nova-conductor[52436]: DEBUG nova.quota [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Getting quotas for user c7860a3233494e459d7e7202299108a6 and project c1f3e5c6ab1a4a5baf037e03c6f22934. Resources: {'instances', 'ram', 'cores'} {{(pid=52436) _get_quotas /opt/stack/nova/nova/quota.py:383}}
[ 614.134426] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52436) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}}
[ 614.134947] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 614.135134] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 614.135386] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 614.142859] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 614.143657] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 614.143859] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 614.144155] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 614.166302] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 614.166623] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 614.166868] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 618.665850] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Took 0.17 seconds to select destinations for 1 instance(s). {{(pid=52436) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}}
[ 618.682483] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 618.682710] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 618.682877] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 618.727713] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 618.727855] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 618.728048] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 618.728583] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 618.729070] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 618.729070] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 618.739499] nova-conductor[52436]: DEBUG nova.quota [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Getting quotas for project db69c56463ff4c458b8adf0fe0ba520a. Resources: {'instances', 'ram', 'cores'} {{(pid=52436) _get_quotas /opt/stack/nova/nova/quota.py:393}}
[ 618.741559] nova-conductor[52436]: DEBUG nova.quota [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Getting quotas for user da5fc0ce38cd43e0ad3a2f69732d0ed6 and project db69c56463ff4c458b8adf0fe0ba520a. Resources: {'instances', 'ram', 'cores'} {{(pid=52436) _get_quotas /opt/stack/nova/nova/quota.py:383}}
[ 618.747385] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52436) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}}
[ 618.747821] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 618.749987] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 618.750210] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.002s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 618.753124] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 618.753948] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 618.754030] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 618.754423] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 618.773988] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 618.774216] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 618.774425] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 620.055318] nova-conductor[52436]: ERROR nova.scheduler.utils [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] [instance: de589259-86c5-4830-8507-2de7ad76c034] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 3f73c04b-d229-4bef-90ed-9080b13ee00b, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance de589259-86c5-4830-8507-2de7ad76c034 was re-scheduled: Binding failed for port 3f73c04b-d229-4bef-90ed-9080b13ee00b, please check neutron logs for more information.\n']
[ 620.056040] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Rescheduling: True {{(pid=52436) build_instances /opt/stack/nova/nova/conductor/manager.py:695}}
[ 620.056393] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance de589259-86c5-4830-8507-2de7ad76c034.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance de589259-86c5-4830-8507-2de7ad76c034.
[ 620.057674] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] [instance: de589259-86c5-4830-8507-2de7ad76c034] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance de589259-86c5-4830-8507-2de7ad76c034.
[ 620.126866] nova-conductor[52436]: DEBUG nova.network.neutron [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] [instance: de589259-86c5-4830-8507-2de7ad76c034] deallocate_for_instance() {{(pid=52436) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 620.215196] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Took 0.21 seconds to select destinations for 1 instance(s). {{(pid=52435) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}}
[ 620.236462] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 620.236682] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 620.236849] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 620.316719] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 620.316719] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 620.316719] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 620.316719] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 620.316719] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 620.316719] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 620.337114] nova-conductor[52435]: DEBUG nova.quota [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Getting quotas for project 8a8269fa65f541b098ed07cc3679e448. Resources: {'instances', 'ram', 'cores'} {{(pid=52435) _get_quotas /opt/stack/nova/nova/quota.py:393}}
[ 620.339398] nova-conductor[52435]: DEBUG nova.quota [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Getting quotas for user 7e5a2f66d2304de9bf40bde0e896ea71 and project 8a8269fa65f541b098ed07cc3679e448. Resources: {'instances', 'ram', 'cores'} {{(pid=52435) _get_quotas /opt/stack/nova/nova/quota.py:383}}
[ 620.348302] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52435) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}}
[ 620.350554] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 620.350554] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 620.351380] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 620.359443] nova-conductor[52435]: DEBUG nova.conductor.manager [None
req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 620.359443] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 620.359443] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 620.359443] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 620.374909] nova-conductor[52435]: ERROR nova.scheduler.utils [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File 
"/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 4afc9255-bbe7-49d4-a0f3-47b43a0aa5b8, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 
'nova.exception.RescheduledException: Build of instance 31aed7b3-ea4a-4db4-b919-5d754f4c3b17 was re-scheduled: Binding failed for port 4afc9255-bbe7-49d4-a0f3-47b43a0aa5b8, please check neutron logs for more information.\n'] [ 620.375357] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Rescheduling: True {{(pid=52435) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 620.375396] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 31aed7b3-ea4a-4db4-b919-5d754f4c3b17.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 31aed7b3-ea4a-4db4-b919-5d754f4c3b17. [ 620.375815] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 31aed7b3-ea4a-4db4-b919-5d754f4c3b17. 
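The error sequence above follows a fixed shape: `PortBindingFailed` raised deep in `_update_ports_for_instance`, re-raised through `excutils.save_and_reraise_exception`, converted to `RescheduledException` in `_build_and_run_instance`, and finally surfaced as `MaxRetriesExceeded` once the conductor has no alternate hosts left ("Alternates: []" in the scheduling entries, so a single failure exhausts the retry budget). The sketch below is illustrative only, not nova's actual implementation: the exception names mirror the `nova.exception` classes visible in the traceback, while `build_on_host` and `schedule_with_retries` are hypothetical stand-ins for `_build_and_run_instance` and the conductor's reschedule loop.

```python
# Illustrative sketch only -- NOT nova's code. Exception names mirror the
# nova.exception classes in the traceback above; build_on_host and
# schedule_with_retries are simplified stand-ins for the compute manager's
# _build_and_run_instance() and the conductor's reschedule logic.

class PortBindingFailed(Exception):
    pass

class MaxRetriesExceeded(Exception):
    pass

def build_on_host(host, port_ok):
    # Stand-in for _build_and_run_instance(): the build fails when Neutron
    # cannot bind the instance's port on this host.
    if not port_ok:
        raise PortBindingFailed(f"Binding failed for port on host {host}")
    return f"ACTIVE on {host}"

def schedule_with_retries(hosts, port_ok=False):
    # Try the selected host, then each alternate. When every candidate
    # fails ("Rescheduling: True" in the log), give up the way the
    # conductor does: raise MaxRetriesExceeded, after which the instance
    # is set to ERROR state.
    for host in hosts:
        try:
            return build_on_host(host, port_ok)
        except PortBindingFailed:
            continue  # reschedule onto the next alternate, if any
    raise MaxRetriesExceeded(
        "Exceeded maximum number of retries. Exhausted all hosts "
        "available for retrying build failures.")
```

With an alternates list of `["cpu-1"]` and a failing port binding, the loop exhausts immediately and raises `MaxRetriesExceeded`, matching the log: one `PortBindingFailed` on cpu-1 with no alternates goes straight to "Setting instance to ERROR state."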
[ 620.439140] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 620.439362] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 620.439526] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 620.476639] nova-conductor[52435]: DEBUG nova.network.neutron [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] deallocate_for_instance() {{(pid=52435) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 620.973361] nova-conductor[52436]: DEBUG nova.network.neutron [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] [instance: 
de589259-86c5-4830-8507-2de7ad76c034] Instance cache missing network info. {{(pid=52436) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 620.979584] nova-conductor[52436]: DEBUG nova.network.neutron [None req-a857b6cb-517f-46be-9e11-f046c46dffa7 tempest-ServerDiagnosticsNegativeTest-1742310921 tempest-ServerDiagnosticsNegativeTest-1742310921-project-member] [instance: de589259-86c5-4830-8507-2de7ad76c034] Updating instance_info_cache with network_info: [] {{(pid=52436) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 621.381506] nova-conductor[52435]: DEBUG nova.network.neutron [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Instance cache missing network info. {{(pid=52435) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 621.397680] nova-conductor[52435]: DEBUG nova.network.neutron [None req-e9c433d9-3dfe-422d-a983-2cb49936d076 tempest-ServerExternalEventsTest-1363802782 tempest-ServerExternalEventsTest-1363802782-project-member] [instance: 31aed7b3-ea4a-4db4-b919-5d754f4c3b17] Updating instance_info_cache with network_info: [] {{(pid=52435) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 622.109900] nova-conductor[52436]: ERROR nova.scheduler.utils [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n 
self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n 
self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 717ab373-c7e5-4f17-99ea-5059dde6cc33, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 17c89372-97f2-4ffa-a13e-606d4f31b08f was re-scheduled: Binding failed for port 717ab373-c7e5-4f17-99ea-5059dde6cc33, please check neutron logs for more information.\n'] [ 622.110632] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Rescheduling: True {{(pid=52436) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 622.110816] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. 
Exhausted all hosts available for retrying build failures for instance 17c89372-97f2-4ffa-a13e-606d4f31b08f.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 17c89372-97f2-4ffa-a13e-606d4f31b08f. [ 622.111136] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 17c89372-97f2-4ffa-a13e-606d4f31b08f. [ 622.155411] nova-conductor[52436]: DEBUG nova.network.neutron [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] deallocate_for_instance() {{(pid=52436) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 622.193531] nova-conductor[52436]: DEBUG nova.network.neutron [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] Instance cache missing network info. 
{{(pid=52436) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 622.198868] nova-conductor[52436]: DEBUG nova.network.neutron [None req-b939a52d-a082-4bd3-8122-f8977f528894 tempest-ServerDiagnosticsTest-1815847417 tempest-ServerDiagnosticsTest-1815847417-project-member] [instance: 17c89372-97f2-4ffa-a13e-606d4f31b08f] Updating instance_info_cache with network_info: [] {{(pid=52436) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 624.309910] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Took 0.26 seconds to select destinations for 1 instance(s). {{(pid=52436) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 624.327786] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 624.328153] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 624.329746] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 624.371969] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 624.372848] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 624.372848] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 624.372957] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 624.373080] 
nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 624.373240] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 624.382492] nova-conductor[52436]: DEBUG nova.quota [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Getting quotas for project 6a76486869574011a18c114d723d40aa. Resources: {'instances', 'ram', 'cores'} {{(pid=52436) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 624.384806] nova-conductor[52436]: DEBUG nova.quota [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Getting quotas for user 9a022db29b5046919ef98582dc088fd2 and project 6a76486869574011a18c114d723d40aa. 
Resources: {'instances', 'ram', 'cores'} {{(pid=52436) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 624.391913] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52436) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 624.391913] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 624.391990] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 624.392292] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 624.394908] nova-conductor[52436]: DEBUG nova.conductor.manager [None 
req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 624.395836] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 624.395905] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 624.396079] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 624.414022] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 624.414022] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 624.414022] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 627.370913] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Took 0.17 seconds to select destinations for 1 instance(s). 
{{(pid=52436) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 627.386617] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 627.386919] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 627.387275] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 627.419752] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 627.419975] nova-conductor[52436]: DEBUG 
oslo_concurrency.lockutils [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 627.420127] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 627.420490] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 627.420766] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 627.420853] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 
tempest-InstanceActionsV221TestJSON-1809115902-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 627.429806] nova-conductor[52436]: DEBUG nova.quota [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Getting quotas for project 1951f850b8e14cf783d324d2842664b8. Resources: {'instances', 'ram', 'cores'} {{(pid=52436) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 627.434082] nova-conductor[52436]: DEBUG nova.quota [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Getting quotas for user 9d11f36aa5714e778b9b89d84d55f3b8 and project 1951f850b8e14cf783d324d2842664b8. Resources: {'instances', 'ram', 'cores'} {{(pid=52436) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 627.438458] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52436) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 627.438956] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 627.439183] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 627.439463] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 627.442715] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 627.443596] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None 
req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 627.443596] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 627.443761] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 627.458037] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 627.458354] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 
tempest-InstanceActionsV221TestJSON-1809115902-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 627.458443] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 630.930023] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Took 0.16 seconds to select destinations for 1 instance(s). 
{{(pid=52436) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 630.952589] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 630.952828] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 630.952994] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 631.015117] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 631.015336] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 
tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 631.015568] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 631.015954] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 631.016149] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 631.016317] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 631.037416] nova-conductor[52436]: DEBUG nova.quota [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Getting quotas for project 38e80b51cffa497d967daa587f7880af. Resources: {'instances', 'ram', 'cores'} {{(pid=52436) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 631.041145] nova-conductor[52436]: DEBUG nova.quota [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Getting quotas for user 4aaadc72c029484cb6f29af2622ebe85 and project 38e80b51cffa497d967daa587f7880af. Resources: {'instances', 'ram', 'cores'} {{(pid=52436) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 631.047861] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52436) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 631.049460] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 631.050137] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 
tempest-MigrationsAdminTest-1959481632-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 631.050137] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 631.061166] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 631.061652] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" 
{{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 631.061792] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 631.062207] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 631.110016] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 631.110016] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 631.110016] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None 
req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 635.787788] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Took 0.20 seconds to select destinations for 1 instance(s). {{(pid=52435) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 635.825758] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 635.825999] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 635.826172] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 635.841078] nova-conductor[52436]: ERROR nova.conductor.manager [None req-b4cffcf7-43ac-42c4-aaa0-b87fe1f095c8 tempest-ServerActionsTestOtherA-1047399784 tempest-ServerActionsTestOtherA-1047399784-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 635.841078] nova-conductor[52436]: Traceback (most recent call last): [ 635.841078] nova-conductor[52436]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 635.841078] nova-conductor[52436]: return func(*args, **kwargs) [ 635.841078] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 635.841078] nova-conductor[52436]: selections = self._select_destinations( [ 635.841078] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 635.841078] nova-conductor[52436]: selections = self._schedule( [ 635.841078] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 635.841078] nova-conductor[52436]: self._ensure_sufficient_hosts( [ 635.841078] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 635.841078] nova-conductor[52436]: raise exception.NoValidHost(reason=reason) [ 635.841078] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 635.841078] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last): [ 635.841078] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 635.841078] nova-conductor[52436]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 635.841078] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 635.841078] nova-conductor[52436]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 635.841078] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 635.841078] nova-conductor[52436]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 635.841078] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 635.841078] nova-conductor[52436]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 635.841078] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 635.841078] nova-conductor[52436]: ERROR nova.conductor.manager result = self.transport._send( [ 635.841078] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 635.841078] nova-conductor[52436]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 635.841078] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 635.841078] nova-conductor[52436]: ERROR nova.conductor.manager return 
self._send(target, ctxt, message, wait_for_reply, timeout, [ 635.841078] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 635.841078] nova-conductor[52436]: ERROR nova.conductor.manager raise result [ 635.841078] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 635.841078] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last): [ 635.841078] nova-conductor[52436]: ERROR nova.conductor.manager [ 635.841078] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 635.841078] nova-conductor[52436]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 635.841078] nova-conductor[52436]: ERROR nova.conductor.manager [ 635.841078] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 635.841078] nova-conductor[52436]: ERROR nova.conductor.manager selections = self._select_destinations( [ 635.841078] nova-conductor[52436]: ERROR nova.conductor.manager [ 635.841078] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 635.841078] nova-conductor[52436]: ERROR nova.conductor.manager selections = self._schedule( [ 635.841078] nova-conductor[52436]: ERROR nova.conductor.manager [ 635.841078] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 635.841078] nova-conductor[52436]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 635.841078] nova-conductor[52436]: ERROR nova.conductor.manager [ 635.841078] nova-conductor[52436]: ERROR nova.conductor.manager File 
"/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 635.841078] nova-conductor[52436]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 635.841078] nova-conductor[52436]: ERROR nova.conductor.manager [ 635.841078] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 635.841078] nova-conductor[52436]: ERROR nova.conductor.manager [ 635.841078] nova-conductor[52436]: ERROR nova.conductor.manager [ 635.846937] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b4cffcf7-43ac-42c4-aaa0-b87fe1f095c8 tempest-ServerActionsTestOtherA-1047399784 tempest-ServerActionsTestOtherA-1047399784-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 635.847194] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b4cffcf7-43ac-42c4-aaa0-b87fe1f095c8 tempest-ServerActionsTestOtherA-1047399784 tempest-ServerActionsTestOtherA-1047399784-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 635.847427] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b4cffcf7-43ac-42c4-aaa0-b87fe1f095c8 tempest-ServerActionsTestOtherA-1047399784 tempest-ServerActionsTestOtherA-1047399784-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 635.897059] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None 
req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 635.897308] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 635.897488] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 635.897853] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 635.898062] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 
tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 635.898236] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 635.909639] nova-conductor[52435]: DEBUG nova.quota [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Getting quotas for project d020d014854949569b330b6a18e177dc. Resources: {'instances', 'ram', 'cores'} {{(pid=52435) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 635.913292] nova-conductor[52435]: DEBUG nova.quota [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Getting quotas for user 05f862aa6748451a96b89a90ebbfa4a5 and project d020d014854949569b330b6a18e177dc. 
Resources: {'instances', 'ram', 'cores'} {{(pid=52435) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 635.922203] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52435) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 635.922907] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 635.922907] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 635.922907] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 635.925852] 
nova-conductor[52435]: DEBUG nova.conductor.manager [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 635.926586] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 635.927249] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 635.927249] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 
tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 635.936611] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-b4cffcf7-43ac-42c4-aaa0-b87fe1f095c8 tempest-ServerActionsTestOtherA-1047399784 tempest-ServerActionsTestOtherA-1047399784-project-member] [instance: a5a50925-a2b3-403b-8c0e-b36be3da27be] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 635.937577] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b4cffcf7-43ac-42c4-aaa0-b87fe1f095c8 tempest-ServerActionsTestOtherA-1047399784 tempest-ServerActionsTestOtherA-1047399784-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 635.937577] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b4cffcf7-43ac-42c4-aaa0-b87fe1f095c8 tempest-ServerActionsTestOtherA-1047399784 tempest-ServerActionsTestOtherA-1047399784-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 635.938027] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b4cffcf7-43ac-42c4-aaa0-b87fe1f095c8 tempest-ServerActionsTestOtherA-1047399784 tempest-ServerActionsTestOtherA-1047399784-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 635.947020] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 635.947020] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 635.947020] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 635.947944] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-b4cffcf7-43ac-42c4-aaa0-b87fe1f095c8 tempest-ServerActionsTestOtherA-1047399784 tempest-ServerActionsTestOtherA-1047399784-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 635.947944] nova-conductor[52436]: Traceback (most recent call last): [ 635.947944] nova-conductor[52436]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 635.947944] nova-conductor[52436]: return func(*args, **kwargs) [ 635.947944] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 635.947944] nova-conductor[52436]: selections = self._select_destinations( [ 635.947944] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 635.947944] nova-conductor[52436]: selections = self._schedule( [ 635.947944] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 635.947944] nova-conductor[52436]: self._ensure_sufficient_hosts( [ 635.947944] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 635.947944] nova-conductor[52436]: raise exception.NoValidHost(reason=reason) [ 635.947944] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 635.947944] nova-conductor[52436]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 635.948641] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-b4cffcf7-43ac-42c4-aaa0-b87fe1f095c8 tempest-ServerActionsTestOtherA-1047399784 tempest-ServerActionsTestOtherA-1047399784-project-member] [instance: a5a50925-a2b3-403b-8c0e-b36be3da27be] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 637.325995] nova-conductor[52436]: ERROR nova.scheduler.utils [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n 
result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port d997de27-f617-40ab-a1b9-b240dae29fb1, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File 
"/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance fddaa828-8da4-4d5d-80d5-484ccf2ab6b8 was re-scheduled: Binding failed for port d997de27-f617-40ab-a1b9-b240dae29fb1, please check neutron logs for more information.\n'] [ 637.327081] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Rescheduling: True {{(pid=52436) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 637.327722] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance fddaa828-8da4-4d5d-80d5-484ccf2ab6b8.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance fddaa828-8da4-4d5d-80d5-484ccf2ab6b8. [ 637.328144] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance fddaa828-8da4-4d5d-80d5-484ccf2ab6b8. 
[ 637.359658] nova-conductor[52436]: DEBUG nova.network.neutron [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] deallocate_for_instance() {{(pid=52436) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 637.456488] nova-conductor[52436]: DEBUG nova.network.neutron [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] Instance cache missing network info. {{(pid=52436) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 637.461684] nova-conductor[52436]: DEBUG nova.network.neutron [None req-be654e08-8ac1-4963-a18f-499ccfd52a46 tempest-InstanceActionsV221TestJSON-1809115902 tempest-InstanceActionsV221TestJSON-1809115902-project-member] [instance: fddaa828-8da4-4d5d-80d5-484ccf2ab6b8] Updating instance_info_cache with network_info: [] {{(pid=52436) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 638.610229] nova-conductor[52436]: ERROR nova.conductor.manager [None req-d14c37c8-3098-455a-94b8-5e9d20a37799 tempest-FloatingIPsAssociationNegativeTestJSON-531196839 tempest-FloatingIPsAssociationNegativeTestJSON-531196839-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 638.610229] nova-conductor[52436]: Traceback (most recent call last): [ 638.610229] nova-conductor[52436]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 638.610229] nova-conductor[52436]: return func(*args, **kwargs) [ 638.610229] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 638.610229] nova-conductor[52436]: selections = self._select_destinations( [ 638.610229] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 638.610229] nova-conductor[52436]: selections = self._schedule( [ 638.610229] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 638.610229] nova-conductor[52436]: self._ensure_sufficient_hosts( [ 638.610229] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 638.610229] nova-conductor[52436]: raise exception.NoValidHost(reason=reason) [ 638.610229] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 638.610229] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last): [ 638.610229] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 638.610229] nova-conductor[52436]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 638.610229] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 638.610229] nova-conductor[52436]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 638.610229] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 638.610229] nova-conductor[52436]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 638.610229] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 638.610229] nova-conductor[52436]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 638.610229] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 638.610229] nova-conductor[52436]: ERROR nova.conductor.manager result = self.transport._send( [ 638.610229] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 638.610229] nova-conductor[52436]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 638.610229] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 638.610229] nova-conductor[52436]: ERROR nova.conductor.manager return 
self._send(target, ctxt, message, wait_for_reply, timeout, [ 638.610229] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 638.610229] nova-conductor[52436]: ERROR nova.conductor.manager raise result [ 638.610229] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 638.610229] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last): [ 638.610229] nova-conductor[52436]: ERROR nova.conductor.manager [ 638.610229] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 638.610229] nova-conductor[52436]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 638.610229] nova-conductor[52436]: ERROR nova.conductor.manager [ 638.610229] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 638.610229] nova-conductor[52436]: ERROR nova.conductor.manager selections = self._select_destinations( [ 638.610229] nova-conductor[52436]: ERROR nova.conductor.manager [ 638.610229] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 638.610229] nova-conductor[52436]: ERROR nova.conductor.manager selections = self._schedule( [ 638.610229] nova-conductor[52436]: ERROR nova.conductor.manager [ 638.610229] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 638.610229] nova-conductor[52436]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 638.610229] nova-conductor[52436]: ERROR nova.conductor.manager [ 638.610229] nova-conductor[52436]: ERROR nova.conductor.manager File 
"/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 638.610229] nova-conductor[52436]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 638.610229] nova-conductor[52436]: ERROR nova.conductor.manager [ 638.610229] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 638.610229] nova-conductor[52436]: ERROR nova.conductor.manager [ 638.610229] nova-conductor[52436]: ERROR nova.conductor.manager [ 638.617843] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-d14c37c8-3098-455a-94b8-5e9d20a37799 tempest-FloatingIPsAssociationNegativeTestJSON-531196839 tempest-FloatingIPsAssociationNegativeTestJSON-531196839-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 638.618118] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-d14c37c8-3098-455a-94b8-5e9d20a37799 tempest-FloatingIPsAssociationNegativeTestJSON-531196839 tempest-FloatingIPsAssociationNegativeTestJSON-531196839-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 638.619964] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-d14c37c8-3098-455a-94b8-5e9d20a37799 tempest-FloatingIPsAssociationNegativeTestJSON-531196839 tempest-FloatingIPsAssociationNegativeTestJSON-531196839-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 638.682684] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-d14c37c8-3098-455a-94b8-5e9d20a37799 tempest-FloatingIPsAssociationNegativeTestJSON-531196839 tempest-FloatingIPsAssociationNegativeTestJSON-531196839-project-member] [instance: ff9fa823-03c3-4b80-b06a-ca7d4aed62a5] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 638.683444] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-d14c37c8-3098-455a-94b8-5e9d20a37799 tempest-FloatingIPsAssociationNegativeTestJSON-531196839 tempest-FloatingIPsAssociationNegativeTestJSON-531196839-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 638.683901] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-d14c37c8-3098-455a-94b8-5e9d20a37799 tempest-FloatingIPsAssociationNegativeTestJSON-531196839 tempest-FloatingIPsAssociationNegativeTestJSON-531196839-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 638.683901] 
nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-d14c37c8-3098-455a-94b8-5e9d20a37799 tempest-FloatingIPsAssociationNegativeTestJSON-531196839 tempest-FloatingIPsAssociationNegativeTestJSON-531196839-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 638.689891] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-d14c37c8-3098-455a-94b8-5e9d20a37799 tempest-FloatingIPsAssociationNegativeTestJSON-531196839 tempest-FloatingIPsAssociationNegativeTestJSON-531196839-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 638.689891] nova-conductor[52436]: Traceback (most recent call last): [ 638.689891] nova-conductor[52436]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 638.689891] nova-conductor[52436]: return func(*args, **kwargs) [ 638.689891] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 638.689891] nova-conductor[52436]: selections = self._select_destinations( [ 638.689891] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 638.689891] nova-conductor[52436]: selections = self._schedule( [ 638.689891] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 638.689891] nova-conductor[52436]: self._ensure_sufficient_hosts( [ 638.689891] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 638.689891] nova-conductor[52436]: raise exception.NoValidHost(reason=reason) [ 638.689891] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 638.689891] nova-conductor[52436]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 638.690266] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-d14c37c8-3098-455a-94b8-5e9d20a37799 tempest-FloatingIPsAssociationNegativeTestJSON-531196839 tempest-FloatingIPsAssociationNegativeTestJSON-531196839-project-member] [instance: ff9fa823-03c3-4b80-b06a-ca7d4aed62a5] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 640.720627] nova-conductor[52435]: ERROR nova.conductor.manager [None req-e7d5cb5c-fd56-43d9-b417-ddd5c8e5de8e tempest-ServersWithSpecificFlavorTestJSON-1295992508 tempest-ServersWithSpecificFlavorTestJSON-1295992508-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 640.720627] nova-conductor[52435]: Traceback (most recent call last): [ 640.720627] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 640.720627] nova-conductor[52435]: return func(*args, **kwargs) [ 640.720627] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 640.720627] nova-conductor[52435]: selections = self._select_destinations( [ 640.720627] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 640.720627] nova-conductor[52435]: selections = self._schedule( [ 640.720627] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 640.720627] nova-conductor[52435]: self._ensure_sufficient_hosts( [ 640.720627] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 640.720627] nova-conductor[52435]: raise exception.NoValidHost(reason=reason) [ 640.720627] 
nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 640.720627] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last): [ 640.720627] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 640.720627] nova-conductor[52435]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 640.720627] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 640.720627] nova-conductor[52435]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 640.720627] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 640.720627] nova-conductor[52435]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 640.720627] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 640.720627] nova-conductor[52435]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 640.720627] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 640.720627] nova-conductor[52435]: ERROR nova.conductor.manager result = self.transport._send( [ 640.720627] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 640.720627] nova-conductor[52435]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 640.720627] nova-conductor[52435]: ERROR nova.conductor.manager File 
"/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 640.720627] nova-conductor[52435]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 640.720627] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 640.720627] nova-conductor[52435]: ERROR nova.conductor.manager raise result [ 640.720627] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 640.720627] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last): [ 640.720627] nova-conductor[52435]: ERROR nova.conductor.manager [ 640.720627] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 640.720627] nova-conductor[52435]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 640.720627] nova-conductor[52435]: ERROR nova.conductor.manager [ 640.720627] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 640.720627] nova-conductor[52435]: ERROR nova.conductor.manager selections = self._select_destinations( [ 640.720627] nova-conductor[52435]: ERROR nova.conductor.manager [ 640.720627] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 640.720627] nova-conductor[52435]: ERROR nova.conductor.manager selections = self._schedule( [ 640.720627] nova-conductor[52435]: ERROR nova.conductor.manager [ 640.720627] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 640.720627] nova-conductor[52435]: ERROR nova.conductor.manager 
self._ensure_sufficient_hosts( [ 640.720627] nova-conductor[52435]: ERROR nova.conductor.manager [ 640.720627] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 640.720627] nova-conductor[52435]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 640.720627] nova-conductor[52435]: ERROR nova.conductor.manager [ 640.720627] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 640.720627] nova-conductor[52435]: ERROR nova.conductor.manager [ 640.720627] nova-conductor[52435]: ERROR nova.conductor.manager [ 640.748776] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-e7d5cb5c-fd56-43d9-b417-ddd5c8e5de8e tempest-ServersWithSpecificFlavorTestJSON-1295992508 tempest-ServersWithSpecificFlavorTestJSON-1295992508-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 640.748776] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-e7d5cb5c-fd56-43d9-b417-ddd5c8e5de8e tempest-ServersWithSpecificFlavorTestJSON-1295992508 tempest-ServersWithSpecificFlavorTestJSON-1295992508-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 640.748776] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-e7d5cb5c-fd56-43d9-b417-ddd5c8e5de8e tempest-ServersWithSpecificFlavorTestJSON-1295992508 tempest-ServersWithSpecificFlavorTestJSON-1295992508-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 640.833391] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-e7d5cb5c-fd56-43d9-b417-ddd5c8e5de8e tempest-ServersWithSpecificFlavorTestJSON-1295992508 tempest-ServersWithSpecificFlavorTestJSON-1295992508-project-member] [instance: 191495fb-521b-42bc-889c-17b7b48776f7] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 640.834137] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-e7d5cb5c-fd56-43d9-b417-ddd5c8e5de8e tempest-ServersWithSpecificFlavorTestJSON-1295992508 tempest-ServersWithSpecificFlavorTestJSON-1295992508-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 640.834349] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-e7d5cb5c-fd56-43d9-b417-ddd5c8e5de8e tempest-ServersWithSpecificFlavorTestJSON-1295992508 tempest-ServersWithSpecificFlavorTestJSON-1295992508-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 640.834511] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-e7d5cb5c-fd56-43d9-b417-ddd5c8e5de8e tempest-ServersWithSpecificFlavorTestJSON-1295992508 tempest-ServersWithSpecificFlavorTestJSON-1295992508-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 640.840650] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-e7d5cb5c-fd56-43d9-b417-ddd5c8e5de8e tempest-ServersWithSpecificFlavorTestJSON-1295992508 tempest-ServersWithSpecificFlavorTestJSON-1295992508-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 640.840650] nova-conductor[52435]: Traceback (most recent call last): [ 640.840650] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 640.840650] nova-conductor[52435]: return func(*args, **kwargs) [ 640.840650] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 640.840650] nova-conductor[52435]: selections = self._select_destinations( [ 640.840650] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 640.840650] nova-conductor[52435]: selections = self._schedule( [ 640.840650] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 640.840650] nova-conductor[52435]: self._ensure_sufficient_hosts( [ 640.840650] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 640.840650] nova-conductor[52435]: raise exception.NoValidHost(reason=reason) [ 640.840650] nova-conductor[52435]: nova.exception.NoValidHost: 
No valid host was found. There are not enough hosts available. [ 640.840650] nova-conductor[52435]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 640.842647] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-e7d5cb5c-fd56-43d9-b417-ddd5c8e5de8e tempest-ServersWithSpecificFlavorTestJSON-1295992508 tempest-ServersWithSpecificFlavorTestJSON-1295992508-project-member] [instance: 191495fb-521b-42bc-889c-17b7b48776f7] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 640.871443] nova-conductor[52435]: ERROR nova.scheduler.utils [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File 
"/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 241b104a-67d3-4121-896b-95c8e8ec95a2, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception 
occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance a5eb8727-e918-4a86-9e40-fe20817ca13c was re-scheduled: Binding failed for port 241b104a-67d3-4121-896b-95c8e8ec95a2, please check neutron logs for more information.\n'] [ 640.871989] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Rescheduling: True {{(pid=52435) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 640.873311] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance a5eb8727-e918-4a86-9e40-fe20817ca13c.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance a5eb8727-e918-4a86-9e40-fe20817ca13c. [ 640.873311] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance a5eb8727-e918-4a86-9e40-fe20817ca13c. 
[ 640.898853] nova-conductor[52435]: DEBUG nova.network.neutron [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] deallocate_for_instance() {{(pid=52435) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 640.998845] nova-conductor[52435]: DEBUG nova.network.neutron [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] Instance cache missing network info. {{(pid=52435) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 641.008045] nova-conductor[52435]: DEBUG nova.network.neutron [None req-457641b5-38a3-4e74-abd8-79b7f8e5a784 tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] [instance: a5eb8727-e918-4a86-9e40-fe20817ca13c] Updating instance_info_cache with network_info: [] {{(pid=52435) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 644.191731] nova-conductor[52435]: ERROR nova.conductor.manager [None req-0d1fe599-76bf-49c4-840a-42171f76a5e7 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 644.191731] nova-conductor[52435]: Traceback (most recent call last): [ 644.191731] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 644.191731] nova-conductor[52435]: return func(*args, **kwargs) [ 644.191731] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 644.191731] nova-conductor[52435]: selections = self._select_destinations( [ 644.191731] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 644.191731] nova-conductor[52435]: selections = self._schedule( [ 644.191731] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 644.191731] nova-conductor[52435]: self._ensure_sufficient_hosts( [ 644.191731] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 644.191731] nova-conductor[52435]: raise exception.NoValidHost(reason=reason) [ 644.191731] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 644.191731] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last): [ 644.191731] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 644.191731] nova-conductor[52435]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 644.191731] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 644.191731] nova-conductor[52435]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 644.191731] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 644.191731] nova-conductor[52435]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 644.191731] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 644.191731] nova-conductor[52435]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 644.191731] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 644.191731] nova-conductor[52435]: ERROR nova.conductor.manager result = self.transport._send( [ 644.191731] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 644.191731] nova-conductor[52435]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 644.191731] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 644.191731] nova-conductor[52435]: ERROR nova.conductor.manager return 
self._send(target, ctxt, message, wait_for_reply, timeout, [ 644.191731] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 644.191731] nova-conductor[52435]: ERROR nova.conductor.manager raise result [ 644.191731] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 644.191731] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last): [ 644.191731] nova-conductor[52435]: ERROR nova.conductor.manager [ 644.191731] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 644.191731] nova-conductor[52435]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 644.191731] nova-conductor[52435]: ERROR nova.conductor.manager [ 644.191731] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 644.191731] nova-conductor[52435]: ERROR nova.conductor.manager selections = self._select_destinations( [ 644.191731] nova-conductor[52435]: ERROR nova.conductor.manager [ 644.191731] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 644.191731] nova-conductor[52435]: ERROR nova.conductor.manager selections = self._schedule( [ 644.191731] nova-conductor[52435]: ERROR nova.conductor.manager [ 644.191731] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 644.191731] nova-conductor[52435]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 644.191731] nova-conductor[52435]: ERROR nova.conductor.manager [ 644.191731] nova-conductor[52435]: ERROR nova.conductor.manager File 
"/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 644.191731] nova-conductor[52435]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 644.191731] nova-conductor[52435]: ERROR nova.conductor.manager [ 644.191731] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 644.191731] nova-conductor[52435]: ERROR nova.conductor.manager [ 644.191731] nova-conductor[52435]: ERROR nova.conductor.manager [ 644.203741] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-0d1fe599-76bf-49c4-840a-42171f76a5e7 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 644.203972] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-0d1fe599-76bf-49c4-840a-42171f76a5e7 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 644.204158] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-0d1fe599-76bf-49c4-840a-42171f76a5e7 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 644.284870] nova-conductor[52435]: DEBUG nova.conductor.manager [None 
req-0d1fe599-76bf-49c4-840a-42171f76a5e7 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] [instance: 8cc49d4d-3c8b-41a5-afaa-23f90a45e18a] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 644.286246] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-0d1fe599-76bf-49c4-840a-42171f76a5e7 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 644.286246] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-0d1fe599-76bf-49c4-840a-42171f76a5e7 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 644.286246] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-0d1fe599-76bf-49c4-840a-42171f76a5e7 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Lock "00000000-0000-0000-0000-000000000000" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 644.291283] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-0d1fe599-76bf-49c4-840a-42171f76a5e7 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 644.291283] nova-conductor[52435]: Traceback (most recent call last): [ 644.291283] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 644.291283] nova-conductor[52435]: return func(*args, **kwargs) [ 644.291283] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 644.291283] nova-conductor[52435]: selections = self._select_destinations( [ 644.291283] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 644.291283] nova-conductor[52435]: selections = self._schedule( [ 644.291283] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 644.291283] nova-conductor[52435]: self._ensure_sufficient_hosts( [ 644.291283] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 644.291283] nova-conductor[52435]: raise exception.NoValidHost(reason=reason) [ 644.291283] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 644.291283] nova-conductor[52435]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 644.291740] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-0d1fe599-76bf-49c4-840a-42171f76a5e7 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] [instance: 8cc49d4d-3c8b-41a5-afaa-23f90a45e18a] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 644.349289] nova-conductor[52435]: ERROR nova.scheduler.utils [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n 
result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 5725e74a-7746-478f-a0b4-0542854b6712, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File 
"/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 34ac1a4c-c729-458f-853f-593e0c935f4c was re-scheduled: Binding failed for port 5725e74a-7746-478f-a0b4-0542854b6712, please check neutron logs for more information.\n'] [ 644.350294] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Rescheduling: True {{(pid=52435) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 644.350294] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 34ac1a4c-c729-458f-853f-593e0c935f4c.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 34ac1a4c-c729-458f-853f-593e0c935f4c. [ 644.350294] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 34ac1a4c-c729-458f-853f-593e0c935f4c. 
[ 644.373828] nova-conductor[52435]: DEBUG nova.network.neutron [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] deallocate_for_instance() {{(pid=52435) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 644.398691] nova-conductor[52435]: DEBUG nova.network.neutron [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] Instance cache missing network info. {{(pid=52435) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 644.401383] nova-conductor[52435]: DEBUG nova.network.neutron [None req-6f5cd38c-1f19-4576-b385-6d5f7753d8c3 tempest-VolumesAssistedSnapshotsTest-1696972646 tempest-VolumesAssistedSnapshotsTest-1696972646-project-member] [instance: 34ac1a4c-c729-458f-853f-593e0c935f4c] Updating instance_info_cache with network_info: [] {{(pid=52435) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 646.488963] nova-conductor[52435]: ERROR nova.conductor.manager [None req-9bea4cac-3f43-41d7-bd5f-8c0d4eca91a6 tempest-ServersTestMultiNic-2066705161 tempest-ServersTestMultiNic-2066705161-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 646.488963] nova-conductor[52435]: Traceback (most recent call last): [ 646.488963] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 646.488963] nova-conductor[52435]: return func(*args, **kwargs) [ 646.488963] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 646.488963] nova-conductor[52435]: selections = self._select_destinations( [ 646.488963] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 646.488963] nova-conductor[52435]: selections = self._schedule( [ 646.488963] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 646.488963] nova-conductor[52435]: self._ensure_sufficient_hosts( [ 646.488963] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 646.488963] nova-conductor[52435]: raise exception.NoValidHost(reason=reason) [ 646.488963] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 646.488963] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 646.488963] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 646.488963] nova-conductor[52435]: ERROR nova.conductor.manager     host_lists = self._schedule_instances(context, request_specs[0],
[ 646.488963] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 646.488963] nova-conductor[52435]: ERROR nova.conductor.manager     host_lists = self.query_client.select_destinations(
[ 646.488963] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 646.488963] nova-conductor[52435]: ERROR nova.conductor.manager     return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 646.488963] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 646.488963] nova-conductor[52435]: ERROR nova.conductor.manager     return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 646.488963] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 646.488963] nova-conductor[52435]: ERROR nova.conductor.manager     result = self.transport._send(
[ 646.488963] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 646.488963] nova-conductor[52435]: ERROR nova.conductor.manager     return self._driver.send(target, ctxt, message,
[ 646.488963] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 646.488963] nova-conductor[52435]: ERROR nova.conductor.manager     return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 646.488963] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 646.488963] nova-conductor[52435]: ERROR nova.conductor.manager     raise result
[ 646.488963] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 646.488963] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 646.488963] nova-conductor[52435]: ERROR nova.conductor.manager
[ 646.488963] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 646.488963] nova-conductor[52435]: ERROR nova.conductor.manager     return func(*args, **kwargs)
[ 646.488963] nova-conductor[52435]: ERROR nova.conductor.manager
[ 646.488963] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 646.488963] nova-conductor[52435]: ERROR nova.conductor.manager     selections = self._select_destinations(
[ 646.488963] nova-conductor[52435]: ERROR nova.conductor.manager
[ 646.488963] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 646.488963] nova-conductor[52435]: ERROR nova.conductor.manager     selections = self._schedule(
[ 646.488963] nova-conductor[52435]: ERROR nova.conductor.manager
[ 646.488963] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 646.488963] nova-conductor[52435]: ERROR nova.conductor.manager     self._ensure_sufficient_hosts(
[ 646.488963] nova-conductor[52435]: ERROR nova.conductor.manager
[ 646.488963] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 646.488963] nova-conductor[52435]: ERROR nova.conductor.manager     raise exception.NoValidHost(reason=reason)
[ 646.488963] nova-conductor[52435]: ERROR nova.conductor.manager
[ 646.488963] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 646.488963] nova-conductor[52435]: ERROR nova.conductor.manager
[ 646.488963] nova-conductor[52435]: ERROR nova.conductor.manager
[ 646.500206] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-9bea4cac-3f43-41d7-bd5f-8c0d4eca91a6 tempest-ServersTestMultiNic-2066705161 tempest-ServersTestMultiNic-2066705161-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 646.500449] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-9bea4cac-3f43-41d7-bd5f-8c0d4eca91a6 tempest-ServersTestMultiNic-2066705161 tempest-ServersTestMultiNic-2066705161-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 646.500625] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-9bea4cac-3f43-41d7-bd5f-8c0d4eca91a6 tempest-ServersTestMultiNic-2066705161 tempest-ServersTestMultiNic-2066705161-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 646.567498] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-9bea4cac-3f43-41d7-bd5f-8c0d4eca91a6 tempest-ServersTestMultiNic-2066705161 tempest-ServersTestMultiNic-2066705161-project-member] [instance: 6f2f45ab-c57e-40e7-aaa3-789533cec95a] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 646.568172] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-9bea4cac-3f43-41d7-bd5f-8c0d4eca91a6 tempest-ServersTestMultiNic-2066705161 tempest-ServersTestMultiNic-2066705161-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 646.568378] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-9bea4cac-3f43-41d7-bd5f-8c0d4eca91a6 tempest-ServersTestMultiNic-2066705161 tempest-ServersTestMultiNic-2066705161-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 646.568543] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-9bea4cac-3f43-41d7-bd5f-8c0d4eca91a6 tempest-ServersTestMultiNic-2066705161 tempest-ServersTestMultiNic-2066705161-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 646.572530] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-9bea4cac-3f43-41d7-bd5f-8c0d4eca91a6 tempest-ServersTestMultiNic-2066705161 tempest-ServersTestMultiNic-2066705161-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 646.572530] nova-conductor[52435]: Traceback (most recent call last):
[ 646.572530] nova-conductor[52435]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 646.572530] nova-conductor[52435]:     return func(*args, **kwargs)
[ 646.572530] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 646.572530] nova-conductor[52435]:     selections = self._select_destinations(
[ 646.572530] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 646.572530] nova-conductor[52435]:     selections = self._schedule(
[ 646.572530] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 646.572530] nova-conductor[52435]:     self._ensure_sufficient_hosts(
[ 646.572530] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 646.572530] nova-conductor[52435]:     raise exception.NoValidHost(reason=reason)
[ 646.572530] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 646.572530] nova-conductor[52435]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 646.572530] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-9bea4cac-3f43-41d7-bd5f-8c0d4eca91a6 tempest-ServersTestMultiNic-2066705161 tempest-ServersTestMultiNic-2066705161-project-member] [instance: 6f2f45ab-c57e-40e7-aaa3-789533cec95a] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 647.325309] nova-conductor[52436]: ERROR nova.conductor.manager [None req-bc88ed1b-88b6-443f-9ef3-543814d53d8c tempest-AttachInterfacesUnderV243Test-1695532587 tempest-AttachInterfacesUnderV243Test-1695532587-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 647.325309] nova-conductor[52436]: Traceback (most recent call last):
[ 647.325309] nova-conductor[52436]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 647.325309] nova-conductor[52436]:     return func(*args, **kwargs)
[ 647.325309] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 647.325309] nova-conductor[52436]:     selections = self._select_destinations(
[ 647.325309] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 647.325309] nova-conductor[52436]:     selections = self._schedule(
[ 647.325309] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 647.325309] nova-conductor[52436]:     self._ensure_sufficient_hosts(
[ 647.325309] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 647.325309] nova-conductor[52436]:     raise exception.NoValidHost(reason=reason)
[ 647.325309] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 647.325309] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 647.325309] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 647.325309] nova-conductor[52436]: ERROR nova.conductor.manager     host_lists = self._schedule_instances(context, request_specs[0],
[ 647.325309] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 647.325309] nova-conductor[52436]: ERROR nova.conductor.manager     host_lists = self.query_client.select_destinations(
[ 647.325309] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 647.325309] nova-conductor[52436]: ERROR nova.conductor.manager     return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 647.325309] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 647.325309] nova-conductor[52436]: ERROR nova.conductor.manager     return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 647.325309] nova-conductor[52436]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 647.325309] nova-conductor[52436]: ERROR nova.conductor.manager     result = self.transport._send(
[ 647.325309] nova-conductor[52436]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 647.325309] nova-conductor[52436]: ERROR nova.conductor.manager     return self._driver.send(target, ctxt, message,
[ 647.325309] nova-conductor[52436]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 647.325309] nova-conductor[52436]: ERROR nova.conductor.manager     return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 647.325309] nova-conductor[52436]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 647.325309] nova-conductor[52436]: ERROR nova.conductor.manager     raise result
[ 647.325309] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 647.325309] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 647.325309] nova-conductor[52436]: ERROR nova.conductor.manager
[ 647.325309] nova-conductor[52436]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 647.325309] nova-conductor[52436]: ERROR nova.conductor.manager     return func(*args, **kwargs)
[ 647.325309] nova-conductor[52436]: ERROR nova.conductor.manager
[ 647.325309] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 647.325309] nova-conductor[52436]: ERROR nova.conductor.manager     selections = self._select_destinations(
[ 647.325309] nova-conductor[52436]: ERROR nova.conductor.manager
[ 647.325309] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 647.325309] nova-conductor[52436]: ERROR nova.conductor.manager     selections = self._schedule(
[ 647.325309] nova-conductor[52436]: ERROR nova.conductor.manager
[ 647.325309] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 647.325309] nova-conductor[52436]: ERROR nova.conductor.manager     self._ensure_sufficient_hosts(
[ 647.325309] nova-conductor[52436]: ERROR nova.conductor.manager
[ 647.325309] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 647.325309] nova-conductor[52436]: ERROR nova.conductor.manager     raise exception.NoValidHost(reason=reason)
[ 647.325309] nova-conductor[52436]: ERROR nova.conductor.manager
[ 647.325309] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 647.325309] nova-conductor[52436]: ERROR nova.conductor.manager
[ 647.325309] nova-conductor[52436]: ERROR nova.conductor.manager
[ 647.340466] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-bc88ed1b-88b6-443f-9ef3-543814d53d8c tempest-AttachInterfacesUnderV243Test-1695532587 tempest-AttachInterfacesUnderV243Test-1695532587-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 647.340466] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-bc88ed1b-88b6-443f-9ef3-543814d53d8c tempest-AttachInterfacesUnderV243Test-1695532587 tempest-AttachInterfacesUnderV243Test-1695532587-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 647.340466] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-bc88ed1b-88b6-443f-9ef3-543814d53d8c tempest-AttachInterfacesUnderV243Test-1695532587 tempest-AttachInterfacesUnderV243Test-1695532587-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 647.402936] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-bc88ed1b-88b6-443f-9ef3-543814d53d8c tempest-AttachInterfacesUnderV243Test-1695532587 tempest-AttachInterfacesUnderV243Test-1695532587-project-member] [instance: b280463d-455f-4ccd-b8c8-4451c764a8b9] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 647.404402] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-bc88ed1b-88b6-443f-9ef3-543814d53d8c tempest-AttachInterfacesUnderV243Test-1695532587 tempest-AttachInterfacesUnderV243Test-1695532587-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 647.404402] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-bc88ed1b-88b6-443f-9ef3-543814d53d8c tempest-AttachInterfacesUnderV243Test-1695532587 tempest-AttachInterfacesUnderV243Test-1695532587-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 647.404402] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-bc88ed1b-88b6-443f-9ef3-543814d53d8c tempest-AttachInterfacesUnderV243Test-1695532587 tempest-AttachInterfacesUnderV243Test-1695532587-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 647.407184] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-bc88ed1b-88b6-443f-9ef3-543814d53d8c tempest-AttachInterfacesUnderV243Test-1695532587 tempest-AttachInterfacesUnderV243Test-1695532587-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 647.407184] nova-conductor[52436]: Traceback (most recent call last):
[ 647.407184] nova-conductor[52436]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 647.407184] nova-conductor[52436]:     return func(*args, **kwargs)
[ 647.407184] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 647.407184] nova-conductor[52436]:     selections = self._select_destinations(
[ 647.407184] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 647.407184] nova-conductor[52436]:     selections = self._schedule(
[ 647.407184] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 647.407184] nova-conductor[52436]:     self._ensure_sufficient_hosts(
[ 647.407184] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 647.407184] nova-conductor[52436]:     raise exception.NoValidHost(reason=reason)
[ 647.407184] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 647.407184] nova-conductor[52436]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 647.407696] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-bc88ed1b-88b6-443f-9ef3-543814d53d8c tempest-AttachInterfacesUnderV243Test-1695532587 tempest-AttachInterfacesUnderV243Test-1695532587-project-member] [instance: b280463d-455f-4ccd-b8c8-4451c764a8b9] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 648.639503] nova-conductor[52435]: ERROR nova.conductor.manager [None req-b2e5abed-f3bb-442c-83c8-34540b13dbd8 tempest-VolumesAdminNegativeTest-2121041050 tempest-VolumesAdminNegativeTest-2121041050-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 648.639503] nova-conductor[52435]: Traceback (most recent call last):
[ 648.639503] nova-conductor[52435]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 648.639503] nova-conductor[52435]:     return func(*args, **kwargs)
[ 648.639503] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 648.639503] nova-conductor[52435]:     selections = self._select_destinations(
[ 648.639503] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 648.639503] nova-conductor[52435]:     selections = self._schedule(
[ 648.639503] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 648.639503] nova-conductor[52435]:     self._ensure_sufficient_hosts(
[ 648.639503] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 648.639503] nova-conductor[52435]:     raise exception.NoValidHost(reason=reason)
[ 648.639503] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 648.639503] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 648.639503] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 648.639503] nova-conductor[52435]: ERROR nova.conductor.manager     host_lists = self._schedule_instances(context, request_specs[0],
[ 648.639503] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 648.639503] nova-conductor[52435]: ERROR nova.conductor.manager     host_lists = self.query_client.select_destinations(
[ 648.639503] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 648.639503] nova-conductor[52435]: ERROR nova.conductor.manager     return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 648.639503] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 648.639503] nova-conductor[52435]: ERROR nova.conductor.manager     return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 648.639503] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 648.639503] nova-conductor[52435]: ERROR nova.conductor.manager     result = self.transport._send(
[ 648.639503] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 648.639503] nova-conductor[52435]: ERROR nova.conductor.manager     return self._driver.send(target, ctxt, message,
[ 648.639503] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 648.639503] nova-conductor[52435]: ERROR nova.conductor.manager     return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 648.639503] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 648.639503] nova-conductor[52435]: ERROR nova.conductor.manager     raise result
[ 648.639503] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 648.639503] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 648.639503] nova-conductor[52435]: ERROR nova.conductor.manager
[ 648.639503] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 648.639503] nova-conductor[52435]: ERROR nova.conductor.manager     return func(*args, **kwargs)
[ 648.639503] nova-conductor[52435]: ERROR nova.conductor.manager
[ 648.639503] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 648.639503] nova-conductor[52435]: ERROR nova.conductor.manager     selections = self._select_destinations(
[ 648.639503] nova-conductor[52435]: ERROR nova.conductor.manager
[ 648.639503] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 648.639503] nova-conductor[52435]: ERROR nova.conductor.manager     selections = self._schedule(
[ 648.639503] nova-conductor[52435]: ERROR nova.conductor.manager
[ 648.639503] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 648.639503] nova-conductor[52435]: ERROR nova.conductor.manager     self._ensure_sufficient_hosts(
[ 648.639503] nova-conductor[52435]: ERROR nova.conductor.manager
[ 648.639503] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 648.639503] nova-conductor[52435]: ERROR nova.conductor.manager     raise exception.NoValidHost(reason=reason)
[ 648.639503] nova-conductor[52435]: ERROR nova.conductor.manager
[ 648.639503] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 648.639503] nova-conductor[52435]: ERROR nova.conductor.manager
[ 648.639503] nova-conductor[52435]: ERROR nova.conductor.manager
[ 648.647294] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-b2e5abed-f3bb-442c-83c8-34540b13dbd8 tempest-VolumesAdminNegativeTest-2121041050 tempest-VolumesAdminNegativeTest-2121041050-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 648.647677] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-b2e5abed-f3bb-442c-83c8-34540b13dbd8 tempest-VolumesAdminNegativeTest-2121041050 tempest-VolumesAdminNegativeTest-2121041050-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 648.647677] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-b2e5abed-f3bb-442c-83c8-34540b13dbd8 tempest-VolumesAdminNegativeTest-2121041050 tempest-VolumesAdminNegativeTest-2121041050-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 648.715615] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-b2e5abed-f3bb-442c-83c8-34540b13dbd8 tempest-VolumesAdminNegativeTest-2121041050 tempest-VolumesAdminNegativeTest-2121041050-project-member] [instance: 4333c7c7-1354-4718-affb-4a5d8e85c8b8] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 648.719424] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-b2e5abed-f3bb-442c-83c8-34540b13dbd8 tempest-VolumesAdminNegativeTest-2121041050 tempest-VolumesAdminNegativeTest-2121041050-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 648.719424] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-b2e5abed-f3bb-442c-83c8-34540b13dbd8 tempest-VolumesAdminNegativeTest-2121041050 tempest-VolumesAdminNegativeTest-2121041050-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 648.719424] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-b2e5abed-f3bb-442c-83c8-34540b13dbd8 tempest-VolumesAdminNegativeTest-2121041050 tempest-VolumesAdminNegativeTest-2121041050-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 648.724928] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-b2e5abed-f3bb-442c-83c8-34540b13dbd8 tempest-VolumesAdminNegativeTest-2121041050 tempest-VolumesAdminNegativeTest-2121041050-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 648.724928] nova-conductor[52435]: Traceback (most recent call last):
[ 648.724928] nova-conductor[52435]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 648.724928] nova-conductor[52435]:     return func(*args, **kwargs)
[ 648.724928] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 648.724928] nova-conductor[52435]:     selections = self._select_destinations(
[ 648.724928] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 648.724928] nova-conductor[52435]:     selections = self._schedule(
[ 648.724928] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 648.724928] nova-conductor[52435]:     self._ensure_sufficient_hosts(
[ 648.724928] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 648.724928] nova-conductor[52435]:     raise exception.NoValidHost(reason=reason)
[ 648.724928] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 648.724928] nova-conductor[52435]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 648.727540] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-b2e5abed-f3bb-442c-83c8-34540b13dbd8 tempest-VolumesAdminNegativeTest-2121041050 tempest-VolumesAdminNegativeTest-2121041050-project-member] [instance: 4333c7c7-1354-4718-affb-4a5d8e85c8b8] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 650.348815] nova-conductor[52436]: ERROR nova.conductor.manager [None req-01bb32f7-f61b-4f1a-a2a8-453f90ee6694 tempest-ServersAdminTestJSON-2079283119 tempest-ServersAdminTestJSON-2079283119-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 650.348815] nova-conductor[52436]: Traceback (most recent call last):
[ 650.348815] nova-conductor[52436]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 650.348815] nova-conductor[52436]:     return func(*args, **kwargs)
[ 650.348815] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 650.348815] nova-conductor[52436]:     selections = self._select_destinations(
[ 650.348815] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 650.348815] nova-conductor[52436]:     selections = self._schedule(
[ 650.348815] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 650.348815] nova-conductor[52436]:     self._ensure_sufficient_hosts(
[ 650.348815] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 650.348815] nova-conductor[52436]:     raise exception.NoValidHost(reason=reason)
[ 650.348815] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 650.348815] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 650.348815] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 650.348815] nova-conductor[52436]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0],
[ 650.348815] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 650.348815] nova-conductor[52436]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations(
[ 650.348815] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 650.348815] nova-conductor[52436]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 650.348815] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 650.348815] nova-conductor[52436]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 650.348815] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 650.348815] nova-conductor[52436]: ERROR nova.conductor.manager result = self.transport._send(
[ 650.348815] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 650.348815] nova-conductor[52436]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message,
[ 650.348815] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 650.348815] nova-conductor[52436]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 650.348815] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 650.348815] nova-conductor[52436]: ERROR nova.conductor.manager raise result
[ 650.348815] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 650.348815] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 650.348815] nova-conductor[52436]: ERROR nova.conductor.manager
[ 650.348815] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 650.348815] nova-conductor[52436]: ERROR nova.conductor.manager return func(*args, **kwargs)
[ 650.348815] nova-conductor[52436]: ERROR nova.conductor.manager
[ 650.348815] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 650.348815] nova-conductor[52436]: ERROR nova.conductor.manager selections = self._select_destinations(
[ 650.348815] nova-conductor[52436]: ERROR nova.conductor.manager
[ 650.348815] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 650.348815] nova-conductor[52436]: ERROR nova.conductor.manager selections = self._schedule(
[ 650.348815] nova-conductor[52436]: ERROR nova.conductor.manager
[ 650.348815] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 650.348815] nova-conductor[52436]: ERROR nova.conductor.manager self._ensure_sufficient_hosts(
[ 650.348815] nova-conductor[52436]: ERROR nova.conductor.manager
[ 650.348815] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 650.348815] nova-conductor[52436]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason)
[ 650.348815] nova-conductor[52436]: ERROR nova.conductor.manager
[ 650.348815] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 650.348815] nova-conductor[52436]: ERROR nova.conductor.manager
[ 650.348815] nova-conductor[52436]: ERROR nova.conductor.manager
[ 650.355852] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-01bb32f7-f61b-4f1a-a2a8-453f90ee6694 tempest-ServersAdminTestJSON-2079283119 tempest-ServersAdminTestJSON-2079283119-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 650.355852] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-01bb32f7-f61b-4f1a-a2a8-453f90ee6694 tempest-ServersAdminTestJSON-2079283119 tempest-ServersAdminTestJSON-2079283119-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 650.355988] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-01bb32f7-f61b-4f1a-a2a8-453f90ee6694 tempest-ServersAdminTestJSON-2079283119 tempest-ServersAdminTestJSON-2079283119-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 650.431374] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-01bb32f7-f61b-4f1a-a2a8-453f90ee6694 tempest-ServersAdminTestJSON-2079283119 tempest-ServersAdminTestJSON-2079283119-project-member] [instance: 39813970-ac66-4859-bd6f-d5fcfc905cf7] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 650.432346] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-01bb32f7-f61b-4f1a-a2a8-453f90ee6694 tempest-ServersAdminTestJSON-2079283119 tempest-ServersAdminTestJSON-2079283119-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 650.434645] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-01bb32f7-f61b-4f1a-a2a8-453f90ee6694 tempest-ServersAdminTestJSON-2079283119 tempest-ServersAdminTestJSON-2079283119-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 650.434645] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-01bb32f7-f61b-4f1a-a2a8-453f90ee6694 tempest-ServersAdminTestJSON-2079283119 tempest-ServersAdminTestJSON-2079283119-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.002s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 650.439460] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-01bb32f7-f61b-4f1a-a2a8-453f90ee6694 tempest-ServersAdminTestJSON-2079283119 tempest-ServersAdminTestJSON-2079283119-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 650.439460] nova-conductor[52436]: Traceback (most recent call last):
[ 650.439460] nova-conductor[52436]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 650.439460] nova-conductor[52436]: return func(*args, **kwargs)
[ 650.439460] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 650.439460] nova-conductor[52436]: selections = self._select_destinations(
[ 650.439460] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 650.439460] nova-conductor[52436]: selections = self._schedule(
[ 650.439460] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 650.439460] nova-conductor[52436]: self._ensure_sufficient_hosts(
[ 650.439460] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 650.439460] nova-conductor[52436]: raise exception.NoValidHost(reason=reason)
[ 650.439460] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 650.439460] nova-conductor[52436]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 650.439908] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-01bb32f7-f61b-4f1a-a2a8-453f90ee6694 tempest-ServersAdminTestJSON-2079283119 tempest-ServersAdminTestJSON-2079283119-project-member] [instance: 39813970-ac66-4859-bd6f-d5fcfc905cf7] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 650.488768] nova-conductor[52435]: ERROR nova.conductor.manager [None req-cc6b45e1-6a18-4995-b2a6-ddb3bd8e28ee tempest-ServersTestFqdnHostnames-424581684 tempest-ServersTestFqdnHostnames-424581684-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 650.488768] nova-conductor[52435]: Traceback (most recent call last):
[ 650.488768] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 650.488768] nova-conductor[52435]: return func(*args, **kwargs)
[ 650.488768] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 650.488768] nova-conductor[52435]: selections = self._select_destinations(
[ 650.488768] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 650.488768] nova-conductor[52435]: selections = self._schedule(
[ 650.488768] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 650.488768] nova-conductor[52435]: self._ensure_sufficient_hosts(
[ 650.488768] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 650.488768] nova-conductor[52435]: raise exception.NoValidHost(reason=reason)
[ 650.488768] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 650.488768] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 650.488768] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 650.488768] nova-conductor[52435]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0],
[ 650.488768] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 650.488768] nova-conductor[52435]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations(
[ 650.488768] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 650.488768] nova-conductor[52435]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 650.488768] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 650.488768] nova-conductor[52435]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 650.488768] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 650.488768] nova-conductor[52435]: ERROR nova.conductor.manager result = self.transport._send(
[ 650.488768] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 650.488768] nova-conductor[52435]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message,
[ 650.488768] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 650.488768] nova-conductor[52435]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 650.488768] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 650.488768] nova-conductor[52435]: ERROR nova.conductor.manager raise result
[ 650.488768] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 650.488768] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 650.488768] nova-conductor[52435]: ERROR nova.conductor.manager
[ 650.488768] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 650.488768] nova-conductor[52435]: ERROR nova.conductor.manager return func(*args, **kwargs)
[ 650.488768] nova-conductor[52435]: ERROR nova.conductor.manager
[ 650.488768] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 650.488768] nova-conductor[52435]: ERROR nova.conductor.manager selections = self._select_destinations(
[ 650.488768] nova-conductor[52435]: ERROR nova.conductor.manager
[ 650.488768] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 650.488768] nova-conductor[52435]: ERROR nova.conductor.manager selections = self._schedule(
[ 650.488768] nova-conductor[52435]: ERROR nova.conductor.manager
[ 650.488768] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 650.488768] nova-conductor[52435]: ERROR nova.conductor.manager self._ensure_sufficient_hosts(
[ 650.488768] nova-conductor[52435]: ERROR nova.conductor.manager
[ 650.488768] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 650.488768] nova-conductor[52435]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason)
[ 650.488768] nova-conductor[52435]: ERROR nova.conductor.manager
[ 650.488768] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 650.488768] nova-conductor[52435]: ERROR nova.conductor.manager
[ 650.488768] nova-conductor[52435]: ERROR nova.conductor.manager
[ 650.496983] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-cc6b45e1-6a18-4995-b2a6-ddb3bd8e28ee tempest-ServersTestFqdnHostnames-424581684 tempest-ServersTestFqdnHostnames-424581684-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 650.497157] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-cc6b45e1-6a18-4995-b2a6-ddb3bd8e28ee tempest-ServersTestFqdnHostnames-424581684 tempest-ServersTestFqdnHostnames-424581684-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 650.497345] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-cc6b45e1-6a18-4995-b2a6-ddb3bd8e28ee tempest-ServersTestFqdnHostnames-424581684 tempest-ServersTestFqdnHostnames-424581684-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 650.543875] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-cc6b45e1-6a18-4995-b2a6-ddb3bd8e28ee tempest-ServersTestFqdnHostnames-424581684 tempest-ServersTestFqdnHostnames-424581684-project-member] [instance: d77b4c95-2b73-4752-923a-c503843ee63a] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 650.544591] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-cc6b45e1-6a18-4995-b2a6-ddb3bd8e28ee tempest-ServersTestFqdnHostnames-424581684 tempest-ServersTestFqdnHostnames-424581684-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 650.544798] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-cc6b45e1-6a18-4995-b2a6-ddb3bd8e28ee tempest-ServersTestFqdnHostnames-424581684 tempest-ServersTestFqdnHostnames-424581684-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 650.544967] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-cc6b45e1-6a18-4995-b2a6-ddb3bd8e28ee tempest-ServersTestFqdnHostnames-424581684 tempest-ServersTestFqdnHostnames-424581684-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 650.547964] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-cc6b45e1-6a18-4995-b2a6-ddb3bd8e28ee tempest-ServersTestFqdnHostnames-424581684 tempest-ServersTestFqdnHostnames-424581684-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 650.547964] nova-conductor[52435]: Traceback (most recent call last):
[ 650.547964] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 650.547964] nova-conductor[52435]: return func(*args, **kwargs)
[ 650.547964] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 650.547964] nova-conductor[52435]: selections = self._select_destinations(
[ 650.547964] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 650.547964] nova-conductor[52435]: selections = self._schedule(
[ 650.547964] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 650.547964] nova-conductor[52435]: self._ensure_sufficient_hosts(
[ 650.547964] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 650.547964] nova-conductor[52435]: raise exception.NoValidHost(reason=reason)
[ 650.547964] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 650.547964] nova-conductor[52435]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 650.548485] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-cc6b45e1-6a18-4995-b2a6-ddb3bd8e28ee tempest-ServersTestFqdnHostnames-424581684 tempest-ServersTestFqdnHostnames-424581684-project-member] [instance: d77b4c95-2b73-4752-923a-c503843ee63a] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 652.478018] nova-conductor[52436]: ERROR nova.conductor.manager [None req-6fe357e6-4474-4872-9e40-7846fb2cd19c tempest-ServersAdminTestJSON-2079283119 tempest-ServersAdminTestJSON-2079283119-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 652.478018] nova-conductor[52436]: Traceback (most recent call last):
[ 652.478018] nova-conductor[52436]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 652.478018] nova-conductor[52436]: return func(*args, **kwargs)
[ 652.478018] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 652.478018] nova-conductor[52436]: selections = self._select_destinations(
[ 652.478018] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 652.478018] nova-conductor[52436]: selections = self._schedule(
[ 652.478018] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 652.478018] nova-conductor[52436]: self._ensure_sufficient_hosts(
[ 652.478018] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 652.478018] nova-conductor[52436]: raise exception.NoValidHost(reason=reason)
[ 652.478018] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 652.478018] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 652.478018] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 652.478018] nova-conductor[52436]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0],
[ 652.478018] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 652.478018] nova-conductor[52436]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations(
[ 652.478018] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 652.478018] nova-conductor[52436]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 652.478018] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 652.478018] nova-conductor[52436]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 652.478018] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 652.478018] nova-conductor[52436]: ERROR nova.conductor.manager result = self.transport._send(
[ 652.478018] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 652.478018] nova-conductor[52436]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message,
[ 652.478018] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 652.478018] nova-conductor[52436]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 652.478018] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 652.478018] nova-conductor[52436]: ERROR nova.conductor.manager raise result
[ 652.478018] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 652.478018] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 652.478018] nova-conductor[52436]: ERROR nova.conductor.manager
[ 652.478018] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 652.478018] nova-conductor[52436]: ERROR nova.conductor.manager return func(*args, **kwargs)
[ 652.478018] nova-conductor[52436]: ERROR nova.conductor.manager
[ 652.478018] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 652.478018] nova-conductor[52436]: ERROR nova.conductor.manager selections = self._select_destinations(
[ 652.478018] nova-conductor[52436]: ERROR nova.conductor.manager
[ 652.478018] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 652.478018] nova-conductor[52436]: ERROR nova.conductor.manager selections = self._schedule(
[ 652.478018] nova-conductor[52436]: ERROR nova.conductor.manager
[ 652.478018] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 652.478018] nova-conductor[52436]: ERROR nova.conductor.manager self._ensure_sufficient_hosts(
[ 652.478018] nova-conductor[52436]: ERROR nova.conductor.manager
[ 652.478018] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 652.478018] nova-conductor[52436]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason)
[ 652.478018] nova-conductor[52436]: ERROR nova.conductor.manager
[ 652.478018] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 652.478018] nova-conductor[52436]: ERROR nova.conductor.manager
[ 652.478018] nova-conductor[52436]: ERROR nova.conductor.manager
[ 652.487237] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-6fe357e6-4474-4872-9e40-7846fb2cd19c tempest-ServersAdminTestJSON-2079283119 tempest-ServersAdminTestJSON-2079283119-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 652.488139] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-6fe357e6-4474-4872-9e40-7846fb2cd19c tempest-ServersAdminTestJSON-2079283119 tempest-ServersAdminTestJSON-2079283119-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 652.488363] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-6fe357e6-4474-4872-9e40-7846fb2cd19c tempest-ServersAdminTestJSON-2079283119 tempest-ServersAdminTestJSON-2079283119-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 652.553050] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-6fe357e6-4474-4872-9e40-7846fb2cd19c tempest-ServersAdminTestJSON-2079283119 tempest-ServersAdminTestJSON-2079283119-project-member] [instance: ea77fc31-2f6f-4724-b885-68f9c2823b97] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 652.553834] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-6fe357e6-4474-4872-9e40-7846fb2cd19c tempest-ServersAdminTestJSON-2079283119 tempest-ServersAdminTestJSON-2079283119-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 652.554053] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-6fe357e6-4474-4872-9e40-7846fb2cd19c tempest-ServersAdminTestJSON-2079283119 tempest-ServersAdminTestJSON-2079283119-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 652.554224] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-6fe357e6-4474-4872-9e40-7846fb2cd19c tempest-ServersAdminTestJSON-2079283119 tempest-ServersAdminTestJSON-2079283119-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 652.558347] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-6fe357e6-4474-4872-9e40-7846fb2cd19c tempest-ServersAdminTestJSON-2079283119 tempest-ServersAdminTestJSON-2079283119-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 652.558347] nova-conductor[52436]: Traceback (most recent call last):
[ 652.558347] nova-conductor[52436]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 652.558347] nova-conductor[52436]: return func(*args, **kwargs)
[ 652.558347] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 652.558347] nova-conductor[52436]: selections = self._select_destinations(
[ 652.558347] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 652.558347] nova-conductor[52436]: selections = self._schedule(
[ 652.558347] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 652.558347] nova-conductor[52436]: self._ensure_sufficient_hosts(
[ 652.558347] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 652.558347] nova-conductor[52436]: raise exception.NoValidHost(reason=reason)
[ 652.558347] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 652.558347] nova-conductor[52436]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 652.558729] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-6fe357e6-4474-4872-9e40-7846fb2cd19c tempest-ServersAdminTestJSON-2079283119 tempest-ServersAdminTestJSON-2079283119-project-member] [instance: ea77fc31-2f6f-4724-b885-68f9c2823b97] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 653.831993] nova-conductor[52435]: ERROR nova.conductor.manager [None req-9a438798-dac3-404e-9394-4538cf46ec5b tempest-ListImageFiltersTestJSON-1511818971 tempest-ListImageFiltersTestJSON-1511818971-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 653.831993] nova-conductor[52435]: Traceback (most recent call last):
[ 653.831993] nova-conductor[52435]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 653.831993] nova-conductor[52435]:     return func(*args, **kwargs)
[ 653.831993] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 653.831993] nova-conductor[52435]:     selections = self._select_destinations(
[ 653.831993] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 653.831993] nova-conductor[52435]:     selections = self._schedule(
[ 653.831993] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 653.831993] nova-conductor[52435]:     self._ensure_sufficient_hosts(
[ 653.831993] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 653.831993] nova-conductor[52435]:     raise exception.NoValidHost(reason=reason)
[ 653.831993] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 653.831993] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 653.831993] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 653.831993] nova-conductor[52435]: ERROR nova.conductor.manager     host_lists = self._schedule_instances(context, request_specs[0],
[ 653.831993] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 653.831993] nova-conductor[52435]: ERROR nova.conductor.manager     host_lists = self.query_client.select_destinations(
[ 653.831993] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 653.831993] nova-conductor[52435]: ERROR nova.conductor.manager     return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 653.831993] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 653.831993] nova-conductor[52435]: ERROR nova.conductor.manager     return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 653.831993] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 653.831993] nova-conductor[52435]: ERROR nova.conductor.manager     result = self.transport._send(
[ 653.831993] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 653.831993] nova-conductor[52435]: ERROR nova.conductor.manager     return self._driver.send(target, ctxt, message,
[ 653.831993] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 653.831993] nova-conductor[52435]: ERROR nova.conductor.manager     return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 653.831993] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 653.831993] nova-conductor[52435]: ERROR nova.conductor.manager     raise result
[ 653.831993] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 653.831993] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 653.831993] nova-conductor[52435]: ERROR nova.conductor.manager
[ 653.831993] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 653.831993] nova-conductor[52435]: ERROR nova.conductor.manager     return func(*args, **kwargs)
[ 653.831993] nova-conductor[52435]: ERROR nova.conductor.manager
[ 653.831993] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 653.831993] nova-conductor[52435]: ERROR nova.conductor.manager     selections = self._select_destinations(
[ 653.831993] nova-conductor[52435]: ERROR nova.conductor.manager
[ 653.831993] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 653.831993] nova-conductor[52435]: ERROR nova.conductor.manager     selections = self._schedule(
[ 653.831993] nova-conductor[52435]: ERROR nova.conductor.manager
[ 653.831993] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 653.831993] nova-conductor[52435]: ERROR nova.conductor.manager     self._ensure_sufficient_hosts(
[ 653.831993] nova-conductor[52435]: ERROR nova.conductor.manager
[ 653.831993] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 653.831993] nova-conductor[52435]: ERROR nova.conductor.manager     raise exception.NoValidHost(reason=reason)
[ 653.831993] nova-conductor[52435]: ERROR nova.conductor.manager
[ 653.831993] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 653.831993] nova-conductor[52435]: ERROR nova.conductor.manager
[ 653.831993] nova-conductor[52435]: ERROR nova.conductor.manager
[ 653.839653] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-9a438798-dac3-404e-9394-4538cf46ec5b tempest-ListImageFiltersTestJSON-1511818971 tempest-ListImageFiltersTestJSON-1511818971-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 653.839886] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-9a438798-dac3-404e-9394-4538cf46ec5b tempest-ListImageFiltersTestJSON-1511818971 tempest-ListImageFiltersTestJSON-1511818971-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 653.840099] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-9a438798-dac3-404e-9394-4538cf46ec5b tempest-ListImageFiltersTestJSON-1511818971 tempest-ListImageFiltersTestJSON-1511818971-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 653.892115] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-9a438798-dac3-404e-9394-4538cf46ec5b tempest-ListImageFiltersTestJSON-1511818971 tempest-ListImageFiltersTestJSON-1511818971-project-member] [instance: 0c93fe4a-0c24-42f9-b8be-9e037707910c] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 653.892849] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-9a438798-dac3-404e-9394-4538cf46ec5b tempest-ListImageFiltersTestJSON-1511818971 tempest-ListImageFiltersTestJSON-1511818971-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 653.893075] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-9a438798-dac3-404e-9394-4538cf46ec5b tempest-ListImageFiltersTestJSON-1511818971 tempest-ListImageFiltersTestJSON-1511818971-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 653.893250] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-9a438798-dac3-404e-9394-4538cf46ec5b tempest-ListImageFiltersTestJSON-1511818971 tempest-ListImageFiltersTestJSON-1511818971-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 653.900013] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-9a438798-dac3-404e-9394-4538cf46ec5b tempest-ListImageFiltersTestJSON-1511818971 tempest-ListImageFiltersTestJSON-1511818971-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 653.900013] nova-conductor[52435]: Traceback (most recent call last):
[ 653.900013] nova-conductor[52435]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 653.900013] nova-conductor[52435]:     return func(*args, **kwargs)
[ 653.900013] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 653.900013] nova-conductor[52435]:     selections = self._select_destinations(
[ 653.900013] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 653.900013] nova-conductor[52435]:     selections = self._schedule(
[ 653.900013] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 653.900013] nova-conductor[52435]:     self._ensure_sufficient_hosts(
[ 653.900013] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 653.900013] nova-conductor[52435]:     raise exception.NoValidHost(reason=reason)
[ 653.900013] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 653.900013] nova-conductor[52435]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 653.900495] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-9a438798-dac3-404e-9394-4538cf46ec5b tempest-ListImageFiltersTestJSON-1511818971 tempest-ListImageFiltersTestJSON-1511818971-project-member] [instance: 0c93fe4a-0c24-42f9-b8be-9e037707910c] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 655.252528] nova-conductor[52436]: ERROR nova.conductor.manager [None req-00d2871a-97b4-451b-8c2e-fefe11af7dbd tempest-ListImageFiltersTestJSON-1511818971 tempest-ListImageFiltersTestJSON-1511818971-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 655.252528] nova-conductor[52436]: Traceback (most recent call last):
[ 655.252528] nova-conductor[52436]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 655.252528] nova-conductor[52436]:     return func(*args, **kwargs)
[ 655.252528] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 655.252528] nova-conductor[52436]:     selections = self._select_destinations(
[ 655.252528] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 655.252528] nova-conductor[52436]:     selections = self._schedule(
[ 655.252528] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 655.252528] nova-conductor[52436]:     self._ensure_sufficient_hosts(
[ 655.252528] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 655.252528] nova-conductor[52436]:     raise exception.NoValidHost(reason=reason)
[ 655.252528] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 655.252528] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 655.252528] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 655.252528] nova-conductor[52436]: ERROR nova.conductor.manager     host_lists = self._schedule_instances(context, request_specs[0],
[ 655.252528] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 655.252528] nova-conductor[52436]: ERROR nova.conductor.manager     host_lists = self.query_client.select_destinations(
[ 655.252528] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 655.252528] nova-conductor[52436]: ERROR nova.conductor.manager     return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 655.252528] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 655.252528] nova-conductor[52436]: ERROR nova.conductor.manager     return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 655.252528] nova-conductor[52436]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 655.252528] nova-conductor[52436]: ERROR nova.conductor.manager     result = self.transport._send(
[ 655.252528] nova-conductor[52436]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 655.252528] nova-conductor[52436]: ERROR nova.conductor.manager     return self._driver.send(target, ctxt, message,
[ 655.252528] nova-conductor[52436]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 655.252528] nova-conductor[52436]: ERROR nova.conductor.manager     return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 655.252528] nova-conductor[52436]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 655.252528] nova-conductor[52436]: ERROR nova.conductor.manager     raise result
[ 655.252528] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 655.252528] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 655.252528] nova-conductor[52436]: ERROR nova.conductor.manager
[ 655.252528] nova-conductor[52436]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 655.252528] nova-conductor[52436]: ERROR nova.conductor.manager     return func(*args, **kwargs)
[ 655.252528] nova-conductor[52436]: ERROR nova.conductor.manager
[ 655.252528] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 655.252528] nova-conductor[52436]: ERROR nova.conductor.manager     selections = self._select_destinations(
[ 655.252528] nova-conductor[52436]: ERROR nova.conductor.manager
[ 655.252528] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 655.252528] nova-conductor[52436]: ERROR nova.conductor.manager     selections = self._schedule(
[ 655.252528] nova-conductor[52436]: ERROR nova.conductor.manager
[ 655.252528] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 655.252528] nova-conductor[52436]: ERROR nova.conductor.manager     self._ensure_sufficient_hosts(
[ 655.252528] nova-conductor[52436]: ERROR nova.conductor.manager
[ 655.252528] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 655.252528] nova-conductor[52436]: ERROR nova.conductor.manager     raise exception.NoValidHost(reason=reason)
[ 655.252528] nova-conductor[52436]: ERROR nova.conductor.manager
[ 655.252528] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 655.252528] nova-conductor[52436]: ERROR nova.conductor.manager
[ 655.252528] nova-conductor[52436]: ERROR nova.conductor.manager
[ 655.264969] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-00d2871a-97b4-451b-8c2e-fefe11af7dbd tempest-ListImageFiltersTestJSON-1511818971 tempest-ListImageFiltersTestJSON-1511818971-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 655.266018] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-00d2871a-97b4-451b-8c2e-fefe11af7dbd tempest-ListImageFiltersTestJSON-1511818971 tempest-ListImageFiltersTestJSON-1511818971-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 655.266018] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-00d2871a-97b4-451b-8c2e-fefe11af7dbd tempest-ListImageFiltersTestJSON-1511818971 tempest-ListImageFiltersTestJSON-1511818971-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 655.348588] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-00d2871a-97b4-451b-8c2e-fefe11af7dbd tempest-ListImageFiltersTestJSON-1511818971 tempest-ListImageFiltersTestJSON-1511818971-project-member] [instance: 7d4695d2-b371-45ac-b167-4e1c9e2258b8] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 655.349332] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-00d2871a-97b4-451b-8c2e-fefe11af7dbd tempest-ListImageFiltersTestJSON-1511818971 tempest-ListImageFiltersTestJSON-1511818971-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 655.349545] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-00d2871a-97b4-451b-8c2e-fefe11af7dbd tempest-ListImageFiltersTestJSON-1511818971 tempest-ListImageFiltersTestJSON-1511818971-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 655.349707] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-00d2871a-97b4-451b-8c2e-fefe11af7dbd tempest-ListImageFiltersTestJSON-1511818971 tempest-ListImageFiltersTestJSON-1511818971-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 655.356900] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-00d2871a-97b4-451b-8c2e-fefe11af7dbd tempest-ListImageFiltersTestJSON-1511818971 tempest-ListImageFiltersTestJSON-1511818971-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 655.356900] nova-conductor[52436]: Traceback (most recent call last):
[ 655.356900] nova-conductor[52436]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 655.356900] nova-conductor[52436]:     return func(*args, **kwargs)
[ 655.356900] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 655.356900] nova-conductor[52436]:     selections = self._select_destinations(
[ 655.356900] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 655.356900] nova-conductor[52436]:     selections = self._schedule(
[ 655.356900] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 655.356900] nova-conductor[52436]:     self._ensure_sufficient_hosts(
[ 655.356900] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 655.356900] nova-conductor[52436]:     raise exception.NoValidHost(reason=reason)
[ 655.356900] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 655.356900] nova-conductor[52436]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 655.356900] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-00d2871a-97b4-451b-8c2e-fefe11af7dbd tempest-ListImageFiltersTestJSON-1511818971 tempest-ListImageFiltersTestJSON-1511818971-project-member] [instance: 7d4695d2-b371-45ac-b167-4e1c9e2258b8] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 656.189946] nova-conductor[52435]: ERROR nova.conductor.manager [None req-785c2bdc-f924-4f7e-94e9-eefd2afbedab tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 656.189946] nova-conductor[52435]: Traceback (most recent call last):
[ 656.189946] nova-conductor[52435]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 656.189946] nova-conductor[52435]:     return func(*args, **kwargs)
[ 656.189946] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 656.189946] nova-conductor[52435]:     selections = self._select_destinations(
[ 656.189946] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 656.189946] nova-conductor[52435]:     selections = self._schedule(
[ 656.189946] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 656.189946] nova-conductor[52435]:     self._ensure_sufficient_hosts(
[ 656.189946] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 656.189946] nova-conductor[52435]:     raise exception.NoValidHost(reason=reason)
[ 656.189946] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 656.189946] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 656.189946] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 656.189946] nova-conductor[52435]: ERROR nova.conductor.manager     host_lists = self._schedule_instances(context, request_specs[0],
[ 656.189946] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 656.189946] nova-conductor[52435]: ERROR nova.conductor.manager     host_lists = self.query_client.select_destinations(
[ 656.189946] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 656.189946] nova-conductor[52435]: ERROR nova.conductor.manager     return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 656.189946] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 656.189946] nova-conductor[52435]: ERROR nova.conductor.manager     return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 656.189946] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 656.189946] nova-conductor[52435]: ERROR nova.conductor.manager     result = self.transport._send(
[ 656.189946] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 656.189946] nova-conductor[52435]: ERROR nova.conductor.manager     return self._driver.send(target, ctxt, message,
[ 656.189946] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 656.189946] nova-conductor[52435]: ERROR nova.conductor.manager     return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 656.189946] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 656.189946] nova-conductor[52435]: ERROR nova.conductor.manager     raise result
[ 656.189946] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 656.189946] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 656.189946] nova-conductor[52435]: ERROR nova.conductor.manager
[ 656.189946] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 656.189946] nova-conductor[52435]: ERROR nova.conductor.manager     return func(*args, **kwargs)
[ 656.189946] nova-conductor[52435]: ERROR nova.conductor.manager
[ 656.189946] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 656.189946] nova-conductor[52435]: ERROR nova.conductor.manager     selections = self._select_destinations(
[ 656.189946] nova-conductor[52435]: ERROR nova.conductor.manager
[ 656.189946] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 656.189946] nova-conductor[52435]: ERROR nova.conductor.manager     selections = self._schedule(
[ 656.189946] nova-conductor[52435]: ERROR nova.conductor.manager
[ 656.189946] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 656.189946] nova-conductor[52435]: ERROR nova.conductor.manager     self._ensure_sufficient_hosts(
[ 656.189946] nova-conductor[52435]: ERROR nova.conductor.manager
[ 656.189946] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 656.189946] nova-conductor[52435]: ERROR nova.conductor.manager     raise exception.NoValidHost(reason=reason)
[ 656.189946] nova-conductor[52435]: ERROR nova.conductor.manager
[ 656.189946] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 656.189946] nova-conductor[52435]: ERROR nova.conductor.manager
[ 656.189946] nova-conductor[52435]: ERROR nova.conductor.manager
[ 656.197471] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-785c2bdc-f924-4f7e-94e9-eefd2afbedab tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 656.197823] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-785c2bdc-f924-4f7e-94e9-eefd2afbedab tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 656.198118] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-785c2bdc-f924-4f7e-94e9-eefd2afbedab tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 656.242455] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-785c2bdc-f924-4f7e-94e9-eefd2afbedab tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] [instance: 52ed178b-2e1e-437b-8fbd-c5a6854c7216] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 656.243212] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-785c2bdc-f924-4f7e-94e9-eefd2afbedab tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 656.243423] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-785c2bdc-f924-4f7e-94e9-eefd2afbedab tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 656.243650] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-785c2bdc-f924-4f7e-94e9-eefd2afbedab tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 656.249495] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-785c2bdc-f924-4f7e-94e9-eefd2afbedab tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 656.249495] nova-conductor[52435]: Traceback (most recent call last):
[ 656.249495] nova-conductor[52435]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 656.249495] nova-conductor[52435]:     return func(*args, **kwargs)
[ 656.249495] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 656.249495] nova-conductor[52435]:     selections = self._select_destinations(
[ 656.249495] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 656.249495] nova-conductor[52435]:     selections = self._schedule(
[ 656.249495] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 656.249495] nova-conductor[52435]:     self._ensure_sufficient_hosts(
[ 656.249495] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 656.249495] nova-conductor[52435]:     raise exception.NoValidHost(reason=reason)
[ 656.249495] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 656.249495] nova-conductor[52435]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 656.253125] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-785c2bdc-f924-4f7e-94e9-eefd2afbedab tempest-MigrationsAdminTest-1959481632 tempest-MigrationsAdminTest-1959481632-project-member] [instance: 52ed178b-2e1e-437b-8fbd-c5a6854c7216] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 659.587114] nova-conductor[52436]: ERROR nova.conductor.manager [None req-678ce3a0-91a0-43d1-a744-01bdbb8e629e tempest-VolumesAdminNegativeTest-2121041050 tempest-VolumesAdminNegativeTest-2121041050-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 659.587114] nova-conductor[52436]: Traceback (most recent call last): [ 659.587114] nova-conductor[52436]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 659.587114] nova-conductor[52436]: return func(*args, **kwargs) [ 659.587114] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 659.587114] nova-conductor[52436]: selections = self._select_destinations( [ 659.587114] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 659.587114] nova-conductor[52436]: selections = self._schedule( [ 659.587114] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 659.587114] nova-conductor[52436]: self._ensure_sufficient_hosts( [ 659.587114] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 659.587114] nova-conductor[52436]: raise exception.NoValidHost(reason=reason) [ 659.587114] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 659.587114] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last): [ 659.587114] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 659.587114] nova-conductor[52436]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 659.587114] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 659.587114] nova-conductor[52436]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 659.587114] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 659.587114] nova-conductor[52436]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 659.587114] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 659.587114] nova-conductor[52436]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 659.587114] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 659.587114] nova-conductor[52436]: ERROR nova.conductor.manager result = self.transport._send( [ 659.587114] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 659.587114] nova-conductor[52436]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 659.587114] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 659.587114] nova-conductor[52436]: ERROR nova.conductor.manager return 
self._send(target, ctxt, message, wait_for_reply, timeout, [ 659.587114] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 659.587114] nova-conductor[52436]: ERROR nova.conductor.manager raise result [ 659.587114] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 659.587114] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last): [ 659.587114] nova-conductor[52436]: ERROR nova.conductor.manager [ 659.587114] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 659.587114] nova-conductor[52436]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 659.587114] nova-conductor[52436]: ERROR nova.conductor.manager [ 659.587114] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 659.587114] nova-conductor[52436]: ERROR nova.conductor.manager selections = self._select_destinations( [ 659.587114] nova-conductor[52436]: ERROR nova.conductor.manager [ 659.587114] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 659.587114] nova-conductor[52436]: ERROR nova.conductor.manager selections = self._schedule( [ 659.587114] nova-conductor[52436]: ERROR nova.conductor.manager [ 659.587114] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 659.587114] nova-conductor[52436]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 659.587114] nova-conductor[52436]: ERROR nova.conductor.manager [ 659.587114] nova-conductor[52436]: ERROR nova.conductor.manager File 
"/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 659.587114] nova-conductor[52436]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 659.587114] nova-conductor[52436]: ERROR nova.conductor.manager [ 659.587114] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 659.587114] nova-conductor[52436]: ERROR nova.conductor.manager [ 659.587114] nova-conductor[52436]: ERROR nova.conductor.manager [ 659.598748] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-678ce3a0-91a0-43d1-a744-01bdbb8e629e tempest-VolumesAdminNegativeTest-2121041050 tempest-VolumesAdminNegativeTest-2121041050-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 659.598748] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-678ce3a0-91a0-43d1-a744-01bdbb8e629e tempest-VolumesAdminNegativeTest-2121041050 tempest-VolumesAdminNegativeTest-2121041050-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 659.598748] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-678ce3a0-91a0-43d1-a744-01bdbb8e629e tempest-VolumesAdminNegativeTest-2121041050 tempest-VolumesAdminNegativeTest-2121041050-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 659.667154] nova-conductor[52436]: DEBUG nova.conductor.manager [None 
req-678ce3a0-91a0-43d1-a744-01bdbb8e629e tempest-VolumesAdminNegativeTest-2121041050 tempest-VolumesAdminNegativeTest-2121041050-project-member] [instance: 3307e380-73e7-4d90-8dea-58c219b8447c] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 659.667154] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-678ce3a0-91a0-43d1-a744-01bdbb8e629e tempest-VolumesAdminNegativeTest-2121041050 tempest-VolumesAdminNegativeTest-2121041050-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 659.667154] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-678ce3a0-91a0-43d1-a744-01bdbb8e629e tempest-VolumesAdminNegativeTest-2121041050 tempest-VolumesAdminNegativeTest-2121041050-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 659.667154] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-678ce3a0-91a0-43d1-a744-01bdbb8e629e tempest-VolumesAdminNegativeTest-2121041050 tempest-VolumesAdminNegativeTest-2121041050-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 659.674404] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-678ce3a0-91a0-43d1-a744-01bdbb8e629e tempest-VolumesAdminNegativeTest-2121041050 tempest-VolumesAdminNegativeTest-2121041050-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 659.674404] nova-conductor[52436]: Traceback (most recent call last): [ 659.674404] nova-conductor[52436]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 659.674404] nova-conductor[52436]: return func(*args, **kwargs) [ 659.674404] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 659.674404] nova-conductor[52436]: selections = self._select_destinations( [ 659.674404] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 659.674404] nova-conductor[52436]: selections = self._schedule( [ 659.674404] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 659.674404] nova-conductor[52436]: self._ensure_sufficient_hosts( [ 659.674404] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 659.674404] nova-conductor[52436]: raise exception.NoValidHost(reason=reason) [ 659.674404] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 659.674404] nova-conductor[52436]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 659.675034] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-678ce3a0-91a0-43d1-a744-01bdbb8e629e tempest-VolumesAdminNegativeTest-2121041050 tempest-VolumesAdminNegativeTest-2121041050-project-member] [instance: 3307e380-73e7-4d90-8dea-58c219b8447c] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 660.611886] nova-conductor[52435]: ERROR nova.conductor.manager [None req-84d61112-fda5-426b-b554-af5c6f22ec88 tempest-ImagesNegativeTestJSON-47176368 tempest-ImagesNegativeTestJSON-47176368-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 660.611886] nova-conductor[52435]: Traceback (most recent call last): [ 660.611886] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 660.611886] nova-conductor[52435]: return func(*args, **kwargs) [ 660.611886] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 660.611886] nova-conductor[52435]: selections = self._select_destinations( [ 660.611886] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 660.611886] nova-conductor[52435]: selections = self._schedule( [ 660.611886] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 660.611886] nova-conductor[52435]: self._ensure_sufficient_hosts( [ 660.611886] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 660.611886] nova-conductor[52435]: raise exception.NoValidHost(reason=reason) [ 660.611886] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 660.611886] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last): [ 660.611886] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 660.611886] nova-conductor[52435]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 660.611886] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 660.611886] nova-conductor[52435]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 660.611886] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 660.611886] nova-conductor[52435]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 660.611886] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 660.611886] nova-conductor[52435]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 660.611886] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 660.611886] nova-conductor[52435]: ERROR nova.conductor.manager result = self.transport._send( [ 660.611886] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 660.611886] nova-conductor[52435]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 660.611886] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 660.611886] nova-conductor[52435]: ERROR nova.conductor.manager return 
self._send(target, ctxt, message, wait_for_reply, timeout, [ 660.611886] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 660.611886] nova-conductor[52435]: ERROR nova.conductor.manager raise result [ 660.611886] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 660.611886] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last): [ 660.611886] nova-conductor[52435]: ERROR nova.conductor.manager [ 660.611886] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 660.611886] nova-conductor[52435]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 660.611886] nova-conductor[52435]: ERROR nova.conductor.manager [ 660.611886] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 660.611886] nova-conductor[52435]: ERROR nova.conductor.manager selections = self._select_destinations( [ 660.611886] nova-conductor[52435]: ERROR nova.conductor.manager [ 660.611886] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 660.611886] nova-conductor[52435]: ERROR nova.conductor.manager selections = self._schedule( [ 660.611886] nova-conductor[52435]: ERROR nova.conductor.manager [ 660.611886] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 660.611886] nova-conductor[52435]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 660.611886] nova-conductor[52435]: ERROR nova.conductor.manager [ 660.611886] nova-conductor[52435]: ERROR nova.conductor.manager File 
"/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 660.611886] nova-conductor[52435]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 660.611886] nova-conductor[52435]: ERROR nova.conductor.manager [ 660.611886] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 660.611886] nova-conductor[52435]: ERROR nova.conductor.manager [ 660.611886] nova-conductor[52435]: ERROR nova.conductor.manager [ 660.624009] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-84d61112-fda5-426b-b554-af5c6f22ec88 tempest-ImagesNegativeTestJSON-47176368 tempest-ImagesNegativeTestJSON-47176368-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 660.624248] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-84d61112-fda5-426b-b554-af5c6f22ec88 tempest-ImagesNegativeTestJSON-47176368 tempest-ImagesNegativeTestJSON-47176368-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 660.624419] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-84d61112-fda5-426b-b554-af5c6f22ec88 tempest-ImagesNegativeTestJSON-47176368 tempest-ImagesNegativeTestJSON-47176368-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 660.688878] nova-conductor[52435]: DEBUG nova.conductor.manager [None 
req-84d61112-fda5-426b-b554-af5c6f22ec88 tempest-ImagesNegativeTestJSON-47176368 tempest-ImagesNegativeTestJSON-47176368-project-member] [instance: b2bbc2d6-4faa-4972-8d1e-2bf5c37dcd64] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 660.689656] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-84d61112-fda5-426b-b554-af5c6f22ec88 tempest-ImagesNegativeTestJSON-47176368 tempest-ImagesNegativeTestJSON-47176368-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 660.689656] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-84d61112-fda5-426b-b554-af5c6f22ec88 tempest-ImagesNegativeTestJSON-47176368 tempest-ImagesNegativeTestJSON-47176368-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 660.689781] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-84d61112-fda5-426b-b554-af5c6f22ec88 tempest-ImagesNegativeTestJSON-47176368 tempest-ImagesNegativeTestJSON-47176368-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 660.697678] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-84d61112-fda5-426b-b554-af5c6f22ec88 tempest-ImagesNegativeTestJSON-47176368 tempest-ImagesNegativeTestJSON-47176368-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 660.697678] nova-conductor[52435]: Traceback (most recent call last): [ 660.697678] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 660.697678] nova-conductor[52435]: return func(*args, **kwargs) [ 660.697678] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 660.697678] nova-conductor[52435]: selections = self._select_destinations( [ 660.697678] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 660.697678] nova-conductor[52435]: selections = self._schedule( [ 660.697678] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 660.697678] nova-conductor[52435]: self._ensure_sufficient_hosts( [ 660.697678] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 660.697678] nova-conductor[52435]: raise exception.NoValidHost(reason=reason) [ 660.697678] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 660.697678] nova-conductor[52435]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 660.697678] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-84d61112-fda5-426b-b554-af5c6f22ec88 tempest-ImagesNegativeTestJSON-47176368 tempest-ImagesNegativeTestJSON-47176368-project-member] [instance: b2bbc2d6-4faa-4972-8d1e-2bf5c37dcd64] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 666.233880] nova-conductor[52435]: ERROR nova.scheduler.utils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File 
"/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance 21e37459-3ce1-41e7-8317-b98edafb15d4 was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 666.238192] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Rescheduling: True {{(pid=52435) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 666.238192] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 21e37459-3ce1-41e7-8317-b98edafb15d4.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. 
Exhausted all hosts available for retrying build failures for instance 21e37459-3ce1-41e7-8317-b98edafb15d4.
[ 666.238192] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-9165ad98-f77c-4d6a-907a-01a0c824e14e tempest-ServerDiagnosticsV248Test-2090958171 tempest-ServerDiagnosticsV248Test-2090958171-project-member] [instance: 21e37459-3ce1-41e7-8317-b98edafb15d4] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 21e37459-3ce1-41e7-8317-b98edafb15d4.
[ 667.935826] nova-conductor[52435]: ERROR nova.conductor.manager [None req-788ec1e5-5162-4641-aa68-4ff2bcaac733 tempest-ServersTestMultiNic-2066705161 tempest-ServersTestMultiNic-2066705161-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 667.935826] nova-conductor[52435]: Traceback (most recent call last):
[ 667.935826] nova-conductor[52435]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 667.935826] nova-conductor[52435]:     return func(*args, **kwargs)
[ 667.935826] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 667.935826] nova-conductor[52435]:     selections = self._select_destinations(
[ 667.935826] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 667.935826] nova-conductor[52435]:     selections = self._schedule(
[ 667.935826] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 667.935826] nova-conductor[52435]:     self._ensure_sufficient_hosts(
[ 667.935826] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 667.935826] nova-conductor[52435]:     raise exception.NoValidHost(reason=reason)
[ 667.935826] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 667.935826] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 667.935826] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 667.935826] nova-conductor[52435]: ERROR nova.conductor.manager     host_lists = self._schedule_instances(context, request_specs[0],
[ 667.935826] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 667.935826] nova-conductor[52435]: ERROR nova.conductor.manager     host_lists = self.query_client.select_destinations(
[ 667.935826] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 667.935826] nova-conductor[52435]: ERROR nova.conductor.manager     return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 667.935826] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 667.935826] nova-conductor[52435]: ERROR nova.conductor.manager     return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 667.935826] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 667.935826] nova-conductor[52435]: ERROR nova.conductor.manager     result = self.transport._send(
[ 667.935826] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 667.935826] nova-conductor[52435]: ERROR nova.conductor.manager     return self._driver.send(target, ctxt, message,
[ 667.935826] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 667.935826] nova-conductor[52435]: ERROR nova.conductor.manager     return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 667.935826] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 667.935826] nova-conductor[52435]: ERROR nova.conductor.manager     raise result
[ 667.935826] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 667.935826] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 667.935826] nova-conductor[52435]: ERROR nova.conductor.manager
[ 667.935826] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 667.935826] nova-conductor[52435]: ERROR nova.conductor.manager     return func(*args, **kwargs)
[ 667.935826] nova-conductor[52435]: ERROR nova.conductor.manager
[ 667.935826] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 667.935826] nova-conductor[52435]: ERROR nova.conductor.manager     selections = self._select_destinations(
[ 667.935826] nova-conductor[52435]: ERROR nova.conductor.manager
[ 667.935826] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 667.935826] nova-conductor[52435]: ERROR nova.conductor.manager     selections = self._schedule(
[ 667.935826] nova-conductor[52435]: ERROR nova.conductor.manager
[ 667.935826] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 667.935826] nova-conductor[52435]: ERROR nova.conductor.manager     self._ensure_sufficient_hosts(
[ 667.935826] nova-conductor[52435]: ERROR nova.conductor.manager
[ 667.935826] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 667.935826] nova-conductor[52435]: ERROR nova.conductor.manager     raise exception.NoValidHost(reason=reason)
[ 667.935826] nova-conductor[52435]: ERROR nova.conductor.manager
[ 667.935826] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 667.935826] nova-conductor[52435]: ERROR nova.conductor.manager
[ 667.935826] nova-conductor[52435]: ERROR nova.conductor.manager
[ 667.943645] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-788ec1e5-5162-4641-aa68-4ff2bcaac733 tempest-ServersTestMultiNic-2066705161 tempest-ServersTestMultiNic-2066705161-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 667.943952] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-788ec1e5-5162-4641-aa68-4ff2bcaac733 tempest-ServersTestMultiNic-2066705161 tempest-ServersTestMultiNic-2066705161-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 667.944512] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-788ec1e5-5162-4641-aa68-4ff2bcaac733 tempest-ServersTestMultiNic-2066705161 tempest-ServersTestMultiNic-2066705161-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 668.000640] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-788ec1e5-5162-4641-aa68-4ff2bcaac733 tempest-ServersTestMultiNic-2066705161 tempest-ServersTestMultiNic-2066705161-project-member] [instance: 3d11098b-555c-4cbd-8878-0822a22faf0b] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 668.001644] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-788ec1e5-5162-4641-aa68-4ff2bcaac733 tempest-ServersTestMultiNic-2066705161 tempest-ServersTestMultiNic-2066705161-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 668.001870] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-788ec1e5-5162-4641-aa68-4ff2bcaac733 tempest-ServersTestMultiNic-2066705161 tempest-ServersTestMultiNic-2066705161-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 668.002076] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-788ec1e5-5162-4641-aa68-4ff2bcaac733 tempest-ServersTestMultiNic-2066705161 tempest-ServersTestMultiNic-2066705161-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 668.007953] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-788ec1e5-5162-4641-aa68-4ff2bcaac733 tempest-ServersTestMultiNic-2066705161 tempest-ServersTestMultiNic-2066705161-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 668.007953] nova-conductor[52435]: Traceback (most recent call last):
[ 668.007953] nova-conductor[52435]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 668.007953] nova-conductor[52435]:     return func(*args, **kwargs)
[ 668.007953] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 668.007953] nova-conductor[52435]:     selections = self._select_destinations(
[ 668.007953] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 668.007953] nova-conductor[52435]:     selections = self._schedule(
[ 668.007953] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 668.007953] nova-conductor[52435]:     self._ensure_sufficient_hosts(
[ 668.007953] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 668.007953] nova-conductor[52435]:     raise exception.NoValidHost(reason=reason)
[ 668.007953] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 668.007953] nova-conductor[52435]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 668.007953] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-788ec1e5-5162-4641-aa68-4ff2bcaac733 tempest-ServersTestMultiNic-2066705161 tempest-ServersTestMultiNic-2066705161-project-member] [instance: 3d11098b-555c-4cbd-8878-0822a22faf0b] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 668.426879] nova-conductor[52436]: ERROR nova.conductor.manager [None req-6e7562f0-e2f7-486d-bd9c-cd1c05590c6a tempest-ServerMetadataNegativeTestJSON-1931250371 tempest-ServerMetadataNegativeTestJSON-1931250371-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 668.426879] nova-conductor[52436]: Traceback (most recent call last):
[ 668.426879] nova-conductor[52436]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 668.426879] nova-conductor[52436]:     return func(*args, **kwargs)
[ 668.426879] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 668.426879] nova-conductor[52436]:     selections = self._select_destinations(
[ 668.426879] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 668.426879] nova-conductor[52436]:     selections = self._schedule(
[ 668.426879] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 668.426879] nova-conductor[52436]:     self._ensure_sufficient_hosts(
[ 668.426879] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 668.426879] nova-conductor[52436]:     raise exception.NoValidHost(reason=reason)
[ 668.426879] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 668.426879] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 668.426879] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 668.426879] nova-conductor[52436]: ERROR nova.conductor.manager     host_lists = self._schedule_instances(context, request_specs[0],
[ 668.426879] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 668.426879] nova-conductor[52436]: ERROR nova.conductor.manager     host_lists = self.query_client.select_destinations(
[ 668.426879] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 668.426879] nova-conductor[52436]: ERROR nova.conductor.manager     return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 668.426879] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 668.426879] nova-conductor[52436]: ERROR nova.conductor.manager     return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 668.426879] nova-conductor[52436]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 668.426879] nova-conductor[52436]: ERROR nova.conductor.manager     result = self.transport._send(
[ 668.426879] nova-conductor[52436]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 668.426879] nova-conductor[52436]: ERROR nova.conductor.manager     return self._driver.send(target, ctxt, message,
[ 668.426879] nova-conductor[52436]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 668.426879] nova-conductor[52436]: ERROR nova.conductor.manager     return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 668.426879] nova-conductor[52436]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 668.426879] nova-conductor[52436]: ERROR nova.conductor.manager     raise result
[ 668.426879] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 668.426879] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 668.426879] nova-conductor[52436]: ERROR nova.conductor.manager
[ 668.426879] nova-conductor[52436]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 668.426879] nova-conductor[52436]: ERROR nova.conductor.manager     return func(*args, **kwargs)
[ 668.426879] nova-conductor[52436]: ERROR nova.conductor.manager
[ 668.426879] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 668.426879] nova-conductor[52436]: ERROR nova.conductor.manager     selections = self._select_destinations(
[ 668.426879] nova-conductor[52436]: ERROR nova.conductor.manager
[ 668.426879] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 668.426879] nova-conductor[52436]: ERROR nova.conductor.manager     selections = self._schedule(
[ 668.426879] nova-conductor[52436]: ERROR nova.conductor.manager
[ 668.426879] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 668.426879] nova-conductor[52436]: ERROR nova.conductor.manager     self._ensure_sufficient_hosts(
[ 668.426879] nova-conductor[52436]: ERROR nova.conductor.manager
[ 668.426879] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 668.426879] nova-conductor[52436]: ERROR nova.conductor.manager     raise exception.NoValidHost(reason=reason)
[ 668.426879] nova-conductor[52436]: ERROR nova.conductor.manager
[ 668.426879] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 668.426879] nova-conductor[52436]: ERROR nova.conductor.manager
[ 668.426879] nova-conductor[52436]: ERROR nova.conductor.manager
[ 668.439700] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-6e7562f0-e2f7-486d-bd9c-cd1c05590c6a tempest-ServerMetadataNegativeTestJSON-1931250371 tempest-ServerMetadataNegativeTestJSON-1931250371-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 668.439700] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-6e7562f0-e2f7-486d-bd9c-cd1c05590c6a tempest-ServerMetadataNegativeTestJSON-1931250371 tempest-ServerMetadataNegativeTestJSON-1931250371-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 668.441512] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-6e7562f0-e2f7-486d-bd9c-cd1c05590c6a tempest-ServerMetadataNegativeTestJSON-1931250371 tempest-ServerMetadataNegativeTestJSON-1931250371-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 668.495468] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-6e7562f0-e2f7-486d-bd9c-cd1c05590c6a tempest-ServerMetadataNegativeTestJSON-1931250371 tempest-ServerMetadataNegativeTestJSON-1931250371-project-member] [instance: d4ba25c9-fca4-4661-9875-d925c09cbdcf] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 668.496227] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-6e7562f0-e2f7-486d-bd9c-cd1c05590c6a tempest-ServerMetadataNegativeTestJSON-1931250371 tempest-ServerMetadataNegativeTestJSON-1931250371-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 668.496442] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-6e7562f0-e2f7-486d-bd9c-cd1c05590c6a tempest-ServerMetadataNegativeTestJSON-1931250371 tempest-ServerMetadataNegativeTestJSON-1931250371-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 668.496610] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-6e7562f0-e2f7-486d-bd9c-cd1c05590c6a tempest-ServerMetadataNegativeTestJSON-1931250371 tempest-ServerMetadataNegativeTestJSON-1931250371-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 668.499946] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-6e7562f0-e2f7-486d-bd9c-cd1c05590c6a tempest-ServerMetadataNegativeTestJSON-1931250371 tempest-ServerMetadataNegativeTestJSON-1931250371-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 668.499946] nova-conductor[52436]: Traceback (most recent call last):
[ 668.499946] nova-conductor[52436]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 668.499946] nova-conductor[52436]:     return func(*args, **kwargs)
[ 668.499946] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 668.499946] nova-conductor[52436]:     selections = self._select_destinations(
[ 668.499946] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 668.499946] nova-conductor[52436]:     selections = self._schedule(
[ 668.499946] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 668.499946] nova-conductor[52436]:     self._ensure_sufficient_hosts(
[ 668.499946] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 668.499946] nova-conductor[52436]:     raise exception.NoValidHost(reason=reason)
[ 668.499946] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 668.499946] nova-conductor[52436]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 668.500444] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-6e7562f0-e2f7-486d-bd9c-cd1c05590c6a tempest-ServerMetadataNegativeTestJSON-1931250371 tempest-ServerMetadataNegativeTestJSON-1931250371-project-member] [instance: d4ba25c9-fca4-4661-9875-d925c09cbdcf] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 669.953073] nova-conductor[52435]: ERROR nova.conductor.manager [None req-65ba9ed4-3fcb-494b-8fda-6443ee1747cf tempest-ServerMetadataTestJSON-384392309 tempest-ServerMetadataTestJSON-384392309-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 669.953073] nova-conductor[52435]: Traceback (most recent call last):
[ 669.953073] nova-conductor[52435]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 669.953073] nova-conductor[52435]:     return func(*args, **kwargs)
[ 669.953073] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 669.953073] nova-conductor[52435]:     selections = self._select_destinations(
[ 669.953073] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 669.953073] nova-conductor[52435]:     selections = self._schedule(
[ 669.953073] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 669.953073] nova-conductor[52435]:     self._ensure_sufficient_hosts(
[ 669.953073] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 669.953073] nova-conductor[52435]:     raise exception.NoValidHost(reason=reason)
[ 669.953073] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 669.953073] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 669.953073] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 669.953073] nova-conductor[52435]: ERROR nova.conductor.manager     host_lists = self._schedule_instances(context, request_specs[0],
[ 669.953073] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 669.953073] nova-conductor[52435]: ERROR nova.conductor.manager     host_lists = self.query_client.select_destinations(
[ 669.953073] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 669.953073] nova-conductor[52435]: ERROR nova.conductor.manager     return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 669.953073] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 669.953073] nova-conductor[52435]: ERROR nova.conductor.manager     return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 669.953073] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 669.953073] nova-conductor[52435]: ERROR nova.conductor.manager     result = self.transport._send(
[ 669.953073] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 669.953073] nova-conductor[52435]: ERROR nova.conductor.manager     return self._driver.send(target, ctxt, message,
[ 669.953073] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 669.953073] nova-conductor[52435]: ERROR nova.conductor.manager     return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 669.953073] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 669.953073] nova-conductor[52435]: ERROR nova.conductor.manager     raise result
[ 669.953073] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 669.953073] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 669.953073] nova-conductor[52435]: ERROR nova.conductor.manager
[ 669.953073] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 669.953073] nova-conductor[52435]: ERROR nova.conductor.manager     return func(*args, **kwargs)
[ 669.953073] nova-conductor[52435]: ERROR nova.conductor.manager
[ 669.953073] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 669.953073] nova-conductor[52435]: ERROR nova.conductor.manager     selections = self._select_destinations(
[ 669.953073] nova-conductor[52435]: ERROR nova.conductor.manager
[ 669.953073] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 669.953073] nova-conductor[52435]: ERROR nova.conductor.manager     selections = self._schedule(
[ 669.953073] nova-conductor[52435]: ERROR nova.conductor.manager
[ 669.953073] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 669.953073] nova-conductor[52435]: ERROR nova.conductor.manager     self._ensure_sufficient_hosts(
[ 669.953073] nova-conductor[52435]: ERROR nova.conductor.manager
[ 669.953073] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 669.953073] nova-conductor[52435]: ERROR nova.conductor.manager     raise exception.NoValidHost(reason=reason)
[ 669.953073] nova-conductor[52435]: ERROR nova.conductor.manager
[ 669.953073] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 669.953073] nova-conductor[52435]: ERROR nova.conductor.manager
[ 669.953073] nova-conductor[52435]: ERROR nova.conductor.manager
[ 669.961559] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-65ba9ed4-3fcb-494b-8fda-6443ee1747cf tempest-ServerMetadataTestJSON-384392309 tempest-ServerMetadataTestJSON-384392309-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 669.961798] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-65ba9ed4-3fcb-494b-8fda-6443ee1747cf tempest-ServerMetadataTestJSON-384392309 tempest-ServerMetadataTestJSON-384392309-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 669.961957] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-65ba9ed4-3fcb-494b-8fda-6443ee1747cf tempest-ServerMetadataTestJSON-384392309 tempest-ServerMetadataTestJSON-384392309-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 670.004323] nova-conductor[52436]: ERROR nova.conductor.manager [None req-5e09a2c6-7b3a-4ac2-8630-6039246876a5 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 670.004323] nova-conductor[52436]: Traceback (most recent call last):
[ 670.004323] nova-conductor[52436]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 670.004323] nova-conductor[52436]:     return func(*args, **kwargs)
[ 670.004323] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 670.004323] nova-conductor[52436]:     selections = self._select_destinations(
[ 670.004323] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 670.004323] nova-conductor[52436]:     selections = self._schedule(
[ 670.004323] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 670.004323] nova-conductor[52436]:     self._ensure_sufficient_hosts(
[ 670.004323] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 670.004323] nova-conductor[52436]:     raise exception.NoValidHost(reason=reason)
[ 670.004323] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 670.004323] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 670.004323] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 670.004323] nova-conductor[52436]: ERROR nova.conductor.manager     host_lists = self._schedule_instances(context, request_specs[0],
[ 670.004323] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 670.004323] nova-conductor[52436]: ERROR nova.conductor.manager     host_lists = self.query_client.select_destinations(
[ 670.004323] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 670.004323] nova-conductor[52436]: ERROR nova.conductor.manager     return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 670.004323] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 670.004323] nova-conductor[52436]: ERROR nova.conductor.manager     return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 670.004323] nova-conductor[52436]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 670.004323] nova-conductor[52436]: ERROR nova.conductor.manager     result = self.transport._send(
[ 670.004323] nova-conductor[52436]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 670.004323] nova-conductor[52436]: ERROR nova.conductor.manager     return self._driver.send(target, ctxt, message,
[ 670.004323] nova-conductor[52436]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 670.004323] nova-conductor[52436]: ERROR nova.conductor.manager     return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 670.004323] nova-conductor[52436]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 670.004323] nova-conductor[52436]: ERROR nova.conductor.manager     raise result
[ 670.004323] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 670.004323] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 670.004323] nova-conductor[52436]: ERROR nova.conductor.manager
[ 670.004323] nova-conductor[52436]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 670.004323] nova-conductor[52436]: ERROR nova.conductor.manager     return func(*args, **kwargs)
[ 670.004323] nova-conductor[52436]: ERROR nova.conductor.manager
[ 670.004323] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 670.004323] nova-conductor[52436]: ERROR nova.conductor.manager     selections = self._select_destinations(
[ 670.004323] nova-conductor[52436]: ERROR nova.conductor.manager
[ 670.004323] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 670.004323] nova-conductor[52436]: ERROR nova.conductor.manager     selections = self._schedule(
[ 670.004323] nova-conductor[52436]: ERROR nova.conductor.manager
[ 670.004323] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 670.004323] nova-conductor[52436]: ERROR nova.conductor.manager     self._ensure_sufficient_hosts(
[ 670.004323] nova-conductor[52436]: ERROR nova.conductor.manager
[ 670.004323] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 670.004323] nova-conductor[52436]: ERROR nova.conductor.manager     raise exception.NoValidHost(reason=reason)
[ 670.004323] nova-conductor[52436]: ERROR nova.conductor.manager
[ 670.004323] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 670.004323] nova-conductor[52436]: ERROR nova.conductor.manager
[ 670.004323] nova-conductor[52436]: ERROR nova.conductor.manager
[ 670.012979] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5e09a2c6-7b3a-4ac2-8630-6039246876a5 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 670.013394] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5e09a2c6-7b3a-4ac2-8630-6039246876a5 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 670.013956] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5e09a2c6-7b3a-4ac2-8630-6039246876a5 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 670.033867] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-65ba9ed4-3fcb-494b-8fda-6443ee1747cf tempest-ServerMetadataTestJSON-384392309 tempest-ServerMetadataTestJSON-384392309-project-member] [instance: 9b993bc5-cd0d-45e3-9fa7-4439269be3df] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 670.035090] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-65ba9ed4-3fcb-494b-8fda-6443ee1747cf tempest-ServerMetadataTestJSON-384392309 tempest-ServerMetadataTestJSON-384392309-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 670.035354] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-65ba9ed4-3fcb-494b-8fda-6443ee1747cf tempest-ServerMetadataTestJSON-384392309 tempest-ServerMetadataTestJSON-384392309-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 670.035557] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-65ba9ed4-3fcb-494b-8fda-6443ee1747cf tempest-ServerMetadataTestJSON-384392309 tempest-ServerMetadataTestJSON-384392309-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 670.042716] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-65ba9ed4-3fcb-494b-8fda-6443ee1747cf tempest-ServerMetadataTestJSON-384392309 tempest-ServerMetadataTestJSON-384392309-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 670.042716] nova-conductor[52435]: Traceback (most recent call last): [ 670.042716] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 670.042716] nova-conductor[52435]: return func(*args, **kwargs) [ 670.042716] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 670.042716] nova-conductor[52435]: selections = self._select_destinations( [ 670.042716] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 670.042716] nova-conductor[52435]: selections = self._schedule( [ 670.042716] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 670.042716] nova-conductor[52435]: self._ensure_sufficient_hosts( [ 670.042716] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 670.042716] nova-conductor[52435]: raise exception.NoValidHost(reason=reason) [ 670.042716] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 670.042716] nova-conductor[52435]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 670.043302] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-65ba9ed4-3fcb-494b-8fda-6443ee1747cf tempest-ServerMetadataTestJSON-384392309 tempest-ServerMetadataTestJSON-384392309-project-member] [instance: 9b993bc5-cd0d-45e3-9fa7-4439269be3df] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 670.076939] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-5e09a2c6-7b3a-4ac2-8630-6039246876a5 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 19055e4d-3f21-4aee-b505-925ba3345c1d] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 670.077989] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5e09a2c6-7b3a-4ac2-8630-6039246876a5 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 670.078272] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5e09a2c6-7b3a-4ac2-8630-6039246876a5 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Lock 
"00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 670.078451] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5e09a2c6-7b3a-4ac2-8630-6039246876a5 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 670.081708] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-5e09a2c6-7b3a-4ac2-8630-6039246876a5 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. 
[ 670.081708] nova-conductor[52436]: Traceback (most recent call last): [ 670.081708] nova-conductor[52436]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 670.081708] nova-conductor[52436]: return func(*args, **kwargs) [ 670.081708] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 670.081708] nova-conductor[52436]: selections = self._select_destinations( [ 670.081708] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 670.081708] nova-conductor[52436]: selections = self._schedule( [ 670.081708] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 670.081708] nova-conductor[52436]: self._ensure_sufficient_hosts( [ 670.081708] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 670.081708] nova-conductor[52436]: raise exception.NoValidHost(reason=reason) [ 670.081708] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 670.081708] nova-conductor[52436]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 670.082270] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-5e09a2c6-7b3a-4ac2-8630-6039246876a5 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 19055e4d-3f21-4aee-b505-925ba3345c1d] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 670.987873] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-0a17b3db-dfd5-4dbe-bfb0-9577e6a2c1d5 tempest-ServerGroupTestJSON-1199456404 tempest-ServerGroupTestJSON-1199456404-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 670.988175] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-0a17b3db-dfd5-4dbe-bfb0-9577e6a2c1d5 tempest-ServerGroupTestJSON-1199456404 tempest-ServerGroupTestJSON-1199456404-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 670.988285] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-0a17b3db-dfd5-4dbe-bfb0-9577e6a2c1d5 tempest-ServerGroupTestJSON-1199456404 tempest-ServerGroupTestJSON-1199456404-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 671.113219] nova-conductor[52435]: ERROR nova.conductor.manager [None req-0a17b3db-dfd5-4dbe-bfb0-9577e6a2c1d5 tempest-ServerGroupTestJSON-1199456404 tempest-ServerGroupTestJSON-1199456404-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 671.113219] nova-conductor[52435]: Traceback (most recent call last): [ 671.113219] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 671.113219] nova-conductor[52435]: return func(*args, **kwargs) [ 671.113219] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 671.113219] nova-conductor[52435]: selections = self._select_destinations( [ 671.113219] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 671.113219] nova-conductor[52435]: selections = self._schedule( [ 671.113219] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 671.113219] nova-conductor[52435]: self._ensure_sufficient_hosts( [ 671.113219] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 671.113219] nova-conductor[52435]: raise exception.NoValidHost(reason=reason) [ 671.113219] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 671.113219] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last): [ 671.113219] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 671.113219] nova-conductor[52435]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 671.113219] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 671.113219] nova-conductor[52435]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 671.113219] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 671.113219] nova-conductor[52435]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 671.113219] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 671.113219] nova-conductor[52435]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 671.113219] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 671.113219] nova-conductor[52435]: ERROR nova.conductor.manager result = self.transport._send( [ 671.113219] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 671.113219] nova-conductor[52435]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 671.113219] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 671.113219] nova-conductor[52435]: ERROR nova.conductor.manager return 
self._send(target, ctxt, message, wait_for_reply, timeout, [ 671.113219] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 671.113219] nova-conductor[52435]: ERROR nova.conductor.manager raise result [ 671.113219] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 671.113219] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last): [ 671.113219] nova-conductor[52435]: ERROR nova.conductor.manager [ 671.113219] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 671.113219] nova-conductor[52435]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 671.113219] nova-conductor[52435]: ERROR nova.conductor.manager [ 671.113219] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 671.113219] nova-conductor[52435]: ERROR nova.conductor.manager selections = self._select_destinations( [ 671.113219] nova-conductor[52435]: ERROR nova.conductor.manager [ 671.113219] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 671.113219] nova-conductor[52435]: ERROR nova.conductor.manager selections = self._schedule( [ 671.113219] nova-conductor[52435]: ERROR nova.conductor.manager [ 671.113219] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 671.113219] nova-conductor[52435]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 671.113219] nova-conductor[52435]: ERROR nova.conductor.manager [ 671.113219] nova-conductor[52435]: ERROR nova.conductor.manager File 
"/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 671.113219] nova-conductor[52435]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 671.113219] nova-conductor[52435]: ERROR nova.conductor.manager [ 671.113219] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 671.113219] nova-conductor[52435]: ERROR nova.conductor.manager [ 671.113219] nova-conductor[52435]: ERROR nova.conductor.manager [ 671.124637] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-0a17b3db-dfd5-4dbe-bfb0-9577e6a2c1d5 tempest-ServerGroupTestJSON-1199456404 tempest-ServerGroupTestJSON-1199456404-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 671.124862] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-0a17b3db-dfd5-4dbe-bfb0-9577e6a2c1d5 tempest-ServerGroupTestJSON-1199456404 tempest-ServerGroupTestJSON-1199456404-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 671.125082] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-0a17b3db-dfd5-4dbe-bfb0-9577e6a2c1d5 tempest-ServerGroupTestJSON-1199456404 tempest-ServerGroupTestJSON-1199456404-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 671.205966] nova-conductor[52435]: DEBUG nova.conductor.manager [None 
req-0a17b3db-dfd5-4dbe-bfb0-9577e6a2c1d5 tempest-ServerGroupTestJSON-1199456404 tempest-ServerGroupTestJSON-1199456404-project-member] [instance: 44d4750f-ceea-4c0b-b809-216fd0764a7f] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 671.205966] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-0a17b3db-dfd5-4dbe-bfb0-9577e6a2c1d5 tempest-ServerGroupTestJSON-1199456404 tempest-ServerGroupTestJSON-1199456404-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 671.205966] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-0a17b3db-dfd5-4dbe-bfb0-9577e6a2c1d5 tempest-ServerGroupTestJSON-1199456404 tempest-ServerGroupTestJSON-1199456404-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 671.205966] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-0a17b3db-dfd5-4dbe-bfb0-9577e6a2c1d5 tempest-ServerGroupTestJSON-1199456404 tempest-ServerGroupTestJSON-1199456404-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 671.212203] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-0a17b3db-dfd5-4dbe-bfb0-9577e6a2c1d5 tempest-ServerGroupTestJSON-1199456404 tempest-ServerGroupTestJSON-1199456404-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 671.212203] nova-conductor[52435]: Traceback (most recent call last): [ 671.212203] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 671.212203] nova-conductor[52435]: return func(*args, **kwargs) [ 671.212203] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 671.212203] nova-conductor[52435]: selections = self._select_destinations( [ 671.212203] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 671.212203] nova-conductor[52435]: selections = self._schedule( [ 671.212203] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 671.212203] nova-conductor[52435]: self._ensure_sufficient_hosts( [ 671.212203] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 671.212203] nova-conductor[52435]: raise exception.NoValidHost(reason=reason) [ 671.212203] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 671.212203] nova-conductor[52435]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 671.212203] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-0a17b3db-dfd5-4dbe-bfb0-9577e6a2c1d5 tempest-ServerGroupTestJSON-1199456404 tempest-ServerGroupTestJSON-1199456404-project-member] [instance: 44d4750f-ceea-4c0b-b809-216fd0764a7f] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 675.001990] nova-conductor[52436]: ERROR nova.conductor.manager [None req-6765486c-84d9-4c58-aba0-a4ae1e55eb3e tempest-ServerAddressesNegativeTestJSON-854539167 tempest-ServerAddressesNegativeTestJSON-854539167-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 675.001990] nova-conductor[52436]: Traceback (most recent call last): [ 675.001990] nova-conductor[52436]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 675.001990] nova-conductor[52436]: return func(*args, **kwargs) [ 675.001990] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 675.001990] nova-conductor[52436]: selections = self._select_destinations( [ 675.001990] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 675.001990] nova-conductor[52436]: selections = self._schedule( [ 675.001990] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 675.001990] nova-conductor[52436]: self._ensure_sufficient_hosts( [ 675.001990] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 675.001990] nova-conductor[52436]: raise exception.NoValidHost(reason=reason) [ 675.001990] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 675.001990] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last): [ 675.001990] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 675.001990] nova-conductor[52436]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 675.001990] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 675.001990] nova-conductor[52436]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 675.001990] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 675.001990] nova-conductor[52436]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 675.001990] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 675.001990] nova-conductor[52436]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 675.001990] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 675.001990] nova-conductor[52436]: ERROR nova.conductor.manager result = self.transport._send( [ 675.001990] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 675.001990] nova-conductor[52436]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 675.001990] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 675.001990] nova-conductor[52436]: ERROR nova.conductor.manager return 
self._send(target, ctxt, message, wait_for_reply, timeout, [ 675.001990] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 675.001990] nova-conductor[52436]: ERROR nova.conductor.manager raise result [ 675.001990] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 675.001990] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last): [ 675.001990] nova-conductor[52436]: ERROR nova.conductor.manager [ 675.001990] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 675.001990] nova-conductor[52436]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 675.001990] nova-conductor[52436]: ERROR nova.conductor.manager [ 675.001990] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 675.001990] nova-conductor[52436]: ERROR nova.conductor.manager selections = self._select_destinations( [ 675.001990] nova-conductor[52436]: ERROR nova.conductor.manager [ 675.001990] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 675.001990] nova-conductor[52436]: ERROR nova.conductor.manager selections = self._schedule( [ 675.001990] nova-conductor[52436]: ERROR nova.conductor.manager [ 675.001990] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 675.001990] nova-conductor[52436]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 675.001990] nova-conductor[52436]: ERROR nova.conductor.manager [ 675.001990] nova-conductor[52436]: ERROR nova.conductor.manager File 
"/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 675.001990] nova-conductor[52436]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 675.001990] nova-conductor[52436]: ERROR nova.conductor.manager [ 675.001990] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 675.001990] nova-conductor[52436]: ERROR nova.conductor.manager [ 675.001990] nova-conductor[52436]: ERROR nova.conductor.manager [ 675.009148] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-6765486c-84d9-4c58-aba0-a4ae1e55eb3e tempest-ServerAddressesNegativeTestJSON-854539167 tempest-ServerAddressesNegativeTestJSON-854539167-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 675.009374] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-6765486c-84d9-4c58-aba0-a4ae1e55eb3e tempest-ServerAddressesNegativeTestJSON-854539167 tempest-ServerAddressesNegativeTestJSON-854539167-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 675.009547] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-6765486c-84d9-4c58-aba0-a4ae1e55eb3e tempest-ServerAddressesNegativeTestJSON-854539167 tempest-ServerAddressesNegativeTestJSON-854539167-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 675.086419] nova-conductor[52436]: 
DEBUG nova.conductor.manager [None req-6765486c-84d9-4c58-aba0-a4ae1e55eb3e tempest-ServerAddressesNegativeTestJSON-854539167 tempest-ServerAddressesNegativeTestJSON-854539167-project-member] [instance: 4d31e50e-8770-4509-af35-9c7666c23c8b] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 675.087180] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-6765486c-84d9-4c58-aba0-a4ae1e55eb3e tempest-ServerAddressesNegativeTestJSON-854539167 tempest-ServerAddressesNegativeTestJSON-854539167-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 675.087390] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-6765486c-84d9-4c58-aba0-a4ae1e55eb3e tempest-ServerAddressesNegativeTestJSON-854539167 tempest-ServerAddressesNegativeTestJSON-854539167-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 675.087554] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-6765486c-84d9-4c58-aba0-a4ae1e55eb3e tempest-ServerAddressesNegativeTestJSON-854539167 
tempest-ServerAddressesNegativeTestJSON-854539167-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 675.092222] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-6765486c-84d9-4c58-aba0-a4ae1e55eb3e tempest-ServerAddressesNegativeTestJSON-854539167 tempest-ServerAddressesNegativeTestJSON-854539167-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 675.092222] nova-conductor[52436]: Traceback (most recent call last): [ 675.092222] nova-conductor[52436]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 675.092222] nova-conductor[52436]: return func(*args, **kwargs) [ 675.092222] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 675.092222] nova-conductor[52436]: selections = self._select_destinations( [ 675.092222] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 675.092222] nova-conductor[52436]: selections = self._schedule( [ 675.092222] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 675.092222] nova-conductor[52436]: self._ensure_sufficient_hosts( [ 675.092222] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 675.092222] nova-conductor[52436]: raise exception.NoValidHost(reason=reason) [ 675.092222] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 675.092222] nova-conductor[52436]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 675.092927] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-6765486c-84d9-4c58-aba0-a4ae1e55eb3e tempest-ServerAddressesNegativeTestJSON-854539167 tempest-ServerAddressesNegativeTestJSON-854539167-project-member] [instance: 4d31e50e-8770-4509-af35-9c7666c23c8b] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 676.727431] nova-conductor[52435]: ERROR nova.conductor.manager [None req-dd650ef2-0a0b-400b-8667-d29749a52938 tempest-ListServersNegativeTestJSON-1742549799 tempest-ListServersNegativeTestJSON-1742549799-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 676.727431] nova-conductor[52435]: Traceback (most recent call last):
[ 676.727431] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 676.727431] nova-conductor[52435]: return func(*args, **kwargs)
[ 676.727431] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 676.727431] nova-conductor[52435]: selections = self._select_destinations(
[ 676.727431] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 676.727431] nova-conductor[52435]: selections = self._schedule(
[ 676.727431] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 676.727431] nova-conductor[52435]: self._ensure_sufficient_hosts(
[ 676.727431] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 676.727431] nova-conductor[52435]: raise exception.NoValidHost(reason=reason)
[ 676.727431] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 676.727431] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 676.727431] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 676.727431] nova-conductor[52435]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0],
[ 676.727431] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 676.727431] nova-conductor[52435]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations(
[ 676.727431] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 676.727431] nova-conductor[52435]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 676.727431] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 676.727431] nova-conductor[52435]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 676.727431] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 676.727431] nova-conductor[52435]: ERROR nova.conductor.manager result = self.transport._send(
[ 676.727431] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 676.727431] nova-conductor[52435]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message,
[ 676.727431] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 676.727431] nova-conductor[52435]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 676.727431] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 676.727431] nova-conductor[52435]: ERROR nova.conductor.manager raise result
[ 676.727431] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 676.727431] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 676.727431] nova-conductor[52435]: ERROR nova.conductor.manager
[ 676.727431] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 676.727431] nova-conductor[52435]: ERROR nova.conductor.manager return func(*args, **kwargs)
[ 676.727431] nova-conductor[52435]: ERROR nova.conductor.manager
[ 676.727431] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 676.727431] nova-conductor[52435]: ERROR nova.conductor.manager selections = self._select_destinations(
[ 676.727431] nova-conductor[52435]: ERROR nova.conductor.manager
[ 676.727431] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 676.727431] nova-conductor[52435]: ERROR nova.conductor.manager selections = self._schedule(
[ 676.727431] nova-conductor[52435]: ERROR nova.conductor.manager
[ 676.727431] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 676.727431] nova-conductor[52435]: ERROR nova.conductor.manager self._ensure_sufficient_hosts(
[ 676.727431] nova-conductor[52435]: ERROR nova.conductor.manager
[ 676.727431] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 676.727431] nova-conductor[52435]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason)
[ 676.727431] nova-conductor[52435]: ERROR nova.conductor.manager
[ 676.727431] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 676.727431] nova-conductor[52435]: ERROR nova.conductor.manager
[ 676.727431] nova-conductor[52435]: ERROR nova.conductor.manager
[ 676.739649] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-dd650ef2-0a0b-400b-8667-d29749a52938 tempest-ListServersNegativeTestJSON-1742549799 tempest-ListServersNegativeTestJSON-1742549799-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 676.739868] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-dd650ef2-0a0b-400b-8667-d29749a52938 tempest-ListServersNegativeTestJSON-1742549799 tempest-ListServersNegativeTestJSON-1742549799-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 676.740156] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-dd650ef2-0a0b-400b-8667-d29749a52938 tempest-ListServersNegativeTestJSON-1742549799 tempest-ListServersNegativeTestJSON-1742549799-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 676.784520] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-dd650ef2-0a0b-400b-8667-d29749a52938 tempest-ListServersNegativeTestJSON-1742549799 tempest-ListServersNegativeTestJSON-1742549799-project-member] [instance: f2071807-f1ec-42bd-9f52-5eca825e59d2] block_device_mapping [BlockDeviceMapping(attachment_id=<?>,boot_index=0,connection_info=None,created_at=<?>,delete_on_termination=True,deleted=<?>,deleted_at=<?>,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=<?>,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=<?>,instance_uuid=<?>,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=<?>,uuid=<?>,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 676.785246] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-dd650ef2-0a0b-400b-8667-d29749a52938 tempest-ListServersNegativeTestJSON-1742549799 tempest-ListServersNegativeTestJSON-1742549799-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 676.785458] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-dd650ef2-0a0b-400b-8667-d29749a52938 tempest-ListServersNegativeTestJSON-1742549799 tempest-ListServersNegativeTestJSON-1742549799-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 676.785626] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-dd650ef2-0a0b-400b-8667-d29749a52938 tempest-ListServersNegativeTestJSON-1742549799 tempest-ListServersNegativeTestJSON-1742549799-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 676.789104] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-dd650ef2-0a0b-400b-8667-d29749a52938 tempest-ListServersNegativeTestJSON-1742549799 tempest-ListServersNegativeTestJSON-1742549799-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 676.789104] nova-conductor[52435]: Traceback (most recent call last):
[ 676.789104] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 676.789104] nova-conductor[52435]: return func(*args, **kwargs)
[ 676.789104] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 676.789104] nova-conductor[52435]: selections = self._select_destinations(
[ 676.789104] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 676.789104] nova-conductor[52435]: selections = self._schedule(
[ 676.789104] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 676.789104] nova-conductor[52435]: self._ensure_sufficient_hosts(
[ 676.789104] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 676.789104] nova-conductor[52435]: raise exception.NoValidHost(reason=reason)
[ 676.789104] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 676.789104] nova-conductor[52435]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 676.789698] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-dd650ef2-0a0b-400b-8667-d29749a52938 tempest-ListServersNegativeTestJSON-1742549799 tempest-ListServersNegativeTestJSON-1742549799-project-member] [instance: f2071807-f1ec-42bd-9f52-5eca825e59d2] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 676.812557] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-dd650ef2-0a0b-400b-8667-d29749a52938 tempest-ListServersNegativeTestJSON-1742549799 tempest-ListServersNegativeTestJSON-1742549799-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 676.812798] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-dd650ef2-0a0b-400b-8667-d29749a52938 tempest-ListServersNegativeTestJSON-1742549799 tempest-ListServersNegativeTestJSON-1742549799-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 676.812969] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-dd650ef2-0a0b-400b-8667-d29749a52938 tempest-ListServersNegativeTestJSON-1742549799 tempest-ListServersNegativeTestJSON-1742549799-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 676.852818] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-dd650ef2-0a0b-400b-8667-d29749a52938 tempest-ListServersNegativeTestJSON-1742549799 tempest-ListServersNegativeTestJSON-1742549799-project-member] [instance: bc9618cf-02fe-45d3-8bd2-39b2b0635a95] block_device_mapping [BlockDeviceMapping(attachment_id=<?>,boot_index=0,connection_info=None,created_at=<?>,delete_on_termination=True,deleted=<?>,deleted_at=<?>,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=<?>,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=<?>,instance_uuid=<?>,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=<?>,uuid=<?>,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 676.853527] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-dd650ef2-0a0b-400b-8667-d29749a52938 tempest-ListServersNegativeTestJSON-1742549799 tempest-ListServersNegativeTestJSON-1742549799-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 676.853863] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-dd650ef2-0a0b-400b-8667-d29749a52938 tempest-ListServersNegativeTestJSON-1742549799 tempest-ListServersNegativeTestJSON-1742549799-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 676.853981] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-dd650ef2-0a0b-400b-8667-d29749a52938 tempest-ListServersNegativeTestJSON-1742549799 tempest-ListServersNegativeTestJSON-1742549799-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 676.858803] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-dd650ef2-0a0b-400b-8667-d29749a52938 tempest-ListServersNegativeTestJSON-1742549799 tempest-ListServersNegativeTestJSON-1742549799-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 676.858803] nova-conductor[52435]: Traceback (most recent call last):
[ 676.858803] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 676.858803] nova-conductor[52435]: return func(*args, **kwargs)
[ 676.858803] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 676.858803] nova-conductor[52435]: selections = self._select_destinations(
[ 676.858803] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 676.858803] nova-conductor[52435]: selections = self._schedule(
[ 676.858803] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 676.858803] nova-conductor[52435]: self._ensure_sufficient_hosts(
[ 676.858803] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 676.858803] nova-conductor[52435]: raise exception.NoValidHost(reason=reason)
[ 676.858803] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 676.858803] nova-conductor[52435]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 676.859332] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-dd650ef2-0a0b-400b-8667-d29749a52938 tempest-ListServersNegativeTestJSON-1742549799 tempest-ListServersNegativeTestJSON-1742549799-project-member] [instance: bc9618cf-02fe-45d3-8bd2-39b2b0635a95] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 676.885841] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-dd650ef2-0a0b-400b-8667-d29749a52938 tempest-ListServersNegativeTestJSON-1742549799 tempest-ListServersNegativeTestJSON-1742549799-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 676.885841] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-dd650ef2-0a0b-400b-8667-d29749a52938 tempest-ListServersNegativeTestJSON-1742549799 tempest-ListServersNegativeTestJSON-1742549799-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 676.885841] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-dd650ef2-0a0b-400b-8667-d29749a52938 tempest-ListServersNegativeTestJSON-1742549799 tempest-ListServersNegativeTestJSON-1742549799-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 676.929782] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-dd650ef2-0a0b-400b-8667-d29749a52938 tempest-ListServersNegativeTestJSON-1742549799 tempest-ListServersNegativeTestJSON-1742549799-project-member] [instance: ea8819b2-776d-42ca-aed5-efeb2e362d74] block_device_mapping [BlockDeviceMapping(attachment_id=<?>,boot_index=0,connection_info=None,created_at=<?>,delete_on_termination=True,deleted=<?>,deleted_at=<?>,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=<?>,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=<?>,instance_uuid=<?>,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=<?>,uuid=<?>,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 676.930414] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-dd650ef2-0a0b-400b-8667-d29749a52938 tempest-ListServersNegativeTestJSON-1742549799 tempest-ListServersNegativeTestJSON-1742549799-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 676.930674] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-dd650ef2-0a0b-400b-8667-d29749a52938 tempest-ListServersNegativeTestJSON-1742549799 tempest-ListServersNegativeTestJSON-1742549799-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 676.930868] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-dd650ef2-0a0b-400b-8667-d29749a52938 tempest-ListServersNegativeTestJSON-1742549799 tempest-ListServersNegativeTestJSON-1742549799-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 676.933777] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-dd650ef2-0a0b-400b-8667-d29749a52938 tempest-ListServersNegativeTestJSON-1742549799 tempest-ListServersNegativeTestJSON-1742549799-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 676.933777] nova-conductor[52435]: Traceback (most recent call last):
[ 676.933777] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 676.933777] nova-conductor[52435]: return func(*args, **kwargs)
[ 676.933777] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 676.933777] nova-conductor[52435]: selections = self._select_destinations(
[ 676.933777] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 676.933777] nova-conductor[52435]: selections = self._schedule(
[ 676.933777] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 676.933777] nova-conductor[52435]: self._ensure_sufficient_hosts(
[ 676.933777] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 676.933777] nova-conductor[52435]: raise exception.NoValidHost(reason=reason)
[ 676.933777] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 676.933777] nova-conductor[52435]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 676.934354] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-dd650ef2-0a0b-400b-8667-d29749a52938 tempest-ListServersNegativeTestJSON-1742549799 tempest-ListServersNegativeTestJSON-1742549799-project-member] [instance: ea8819b2-776d-42ca-aed5-efeb2e362d74] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 677.175595] nova-conductor[52436]: ERROR nova.conductor.manager [None req-d33b2a99-9e99-4c75-8fad-588db2b7bc29 tempest-ServerRescueNegativeTestJSON-1122600575 tempest-ServerRescueNegativeTestJSON-1122600575-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 677.175595] nova-conductor[52436]: Traceback (most recent call last):
[ 677.175595] nova-conductor[52436]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 677.175595] nova-conductor[52436]: return func(*args, **kwargs)
[ 677.175595] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 677.175595] nova-conductor[52436]: selections = self._select_destinations(
[ 677.175595] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 677.175595] nova-conductor[52436]: selections = self._schedule(
[ 677.175595] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 677.175595] nova-conductor[52436]: self._ensure_sufficient_hosts(
[ 677.175595] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 677.175595] nova-conductor[52436]: raise exception.NoValidHost(reason=reason)
[ 677.175595] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 677.175595] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 677.175595] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 677.175595] nova-conductor[52436]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0],
[ 677.175595] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 677.175595] nova-conductor[52436]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations(
[ 677.175595] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 677.175595] nova-conductor[52436]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 677.175595] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 677.175595] nova-conductor[52436]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 677.175595] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 677.175595] nova-conductor[52436]: ERROR nova.conductor.manager result = self.transport._send(
[ 677.175595] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 677.175595] nova-conductor[52436]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message,
[ 677.175595] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 677.175595] nova-conductor[52436]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 677.175595] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 677.175595] nova-conductor[52436]: ERROR nova.conductor.manager raise result
[ 677.175595] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 677.175595] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 677.175595] nova-conductor[52436]: ERROR nova.conductor.manager
[ 677.175595] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 677.175595] nova-conductor[52436]: ERROR nova.conductor.manager return func(*args, **kwargs)
[ 677.175595] nova-conductor[52436]: ERROR nova.conductor.manager
[ 677.175595] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 677.175595] nova-conductor[52436]: ERROR nova.conductor.manager selections = self._select_destinations(
[ 677.175595] nova-conductor[52436]: ERROR nova.conductor.manager
[ 677.175595] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 677.175595] nova-conductor[52436]: ERROR nova.conductor.manager selections = self._schedule(
[ 677.175595] nova-conductor[52436]: ERROR nova.conductor.manager
[ 677.175595] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 677.175595] nova-conductor[52436]: ERROR nova.conductor.manager self._ensure_sufficient_hosts(
[ 677.175595] nova-conductor[52436]: ERROR nova.conductor.manager
[ 677.175595] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 677.175595] nova-conductor[52436]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason)
[ 677.175595] nova-conductor[52436]: ERROR nova.conductor.manager
[ 677.175595] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 677.175595] nova-conductor[52436]: ERROR nova.conductor.manager
[ 677.175595] nova-conductor[52436]: ERROR nova.conductor.manager
[ 677.185203] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-d33b2a99-9e99-4c75-8fad-588db2b7bc29 tempest-ServerRescueNegativeTestJSON-1122600575 tempest-ServerRescueNegativeTestJSON-1122600575-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 677.186240] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-d33b2a99-9e99-4c75-8fad-588db2b7bc29 tempest-ServerRescueNegativeTestJSON-1122600575 tempest-ServerRescueNegativeTestJSON-1122600575-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 677.186240] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-d33b2a99-9e99-4c75-8fad-588db2b7bc29 tempest-ServerRescueNegativeTestJSON-1122600575 tempest-ServerRescueNegativeTestJSON-1122600575-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 677.262023] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-d33b2a99-9e99-4c75-8fad-588db2b7bc29 tempest-ServerRescueNegativeTestJSON-1122600575 tempest-ServerRescueNegativeTestJSON-1122600575-project-member] [instance: b4e392ba-2c92-4f9d-9997-2fc133559b81] block_device_mapping [BlockDeviceMapping(attachment_id=<?>,boot_index=0,connection_info=None,created_at=<?>,delete_on_termination=True,deleted=<?>,deleted_at=<?>,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=<?>,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=<?>,instance_uuid=<?>,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=<?>,uuid=<?>,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 677.262023] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-d33b2a99-9e99-4c75-8fad-588db2b7bc29 tempest-ServerRescueNegativeTestJSON-1122600575 tempest-ServerRescueNegativeTestJSON-1122600575-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 677.262023] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-d33b2a99-9e99-4c75-8fad-588db2b7bc29 tempest-ServerRescueNegativeTestJSON-1122600575 tempest-ServerRescueNegativeTestJSON-1122600575-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 677.262023] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-d33b2a99-9e99-4c75-8fad-588db2b7bc29 tempest-ServerRescueNegativeTestJSON-1122600575 tempest-ServerRescueNegativeTestJSON-1122600575-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 677.264675] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-d33b2a99-9e99-4c75-8fad-588db2b7bc29 tempest-ServerRescueNegativeTestJSON-1122600575 tempest-ServerRescueNegativeTestJSON-1122600575-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 677.264675] nova-conductor[52436]: Traceback (most recent call last):
[ 677.264675] nova-conductor[52436]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 677.264675] nova-conductor[52436]: return func(*args, **kwargs)
[ 677.264675] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 677.264675] nova-conductor[52436]: selections = self._select_destinations(
[ 677.264675] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 677.264675] nova-conductor[52436]: selections = self._schedule(
[ 677.264675] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 677.264675] nova-conductor[52436]: self._ensure_sufficient_hosts(
[ 677.264675] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 677.264675] nova-conductor[52436]: raise exception.NoValidHost(reason=reason)
[ 677.264675] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 677.264675] nova-conductor[52436]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 677.265265] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-d33b2a99-9e99-4c75-8fad-588db2b7bc29 tempest-ServerRescueNegativeTestJSON-1122600575 tempest-ServerRescueNegativeTestJSON-1122600575-project-member] [instance: b4e392ba-2c92-4f9d-9997-2fc133559b81] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 681.074167] nova-conductor[52435]: ERROR nova.conductor.manager [None req-d34d37c0-f3a5-4ee4-9540-0c7dc3643882 tempest-ServersTestJSON-1611326517 tempest-ServersTestJSON-1611326517-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 681.074167] nova-conductor[52435]: Traceback (most recent call last):
[ 681.074167] nova-conductor[52435]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 681.074167] nova-conductor[52435]:     return func(*args, **kwargs)
[ 681.074167] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 681.074167] nova-conductor[52435]:     selections = self._select_destinations(
[ 681.074167] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 681.074167] nova-conductor[52435]:     selections = self._schedule(
[ 681.074167] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 681.074167] nova-conductor[52435]:     self._ensure_sufficient_hosts(
[ 681.074167] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 681.074167] nova-conductor[52435]:     raise exception.NoValidHost(reason=reason)
[ 681.074167] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 681.074167] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 681.074167] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 681.074167] nova-conductor[52435]: ERROR nova.conductor.manager     host_lists = self._schedule_instances(context, request_specs[0],
[ 681.074167] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 681.074167] nova-conductor[52435]: ERROR nova.conductor.manager     host_lists = self.query_client.select_destinations(
[ 681.074167] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 681.074167] nova-conductor[52435]: ERROR nova.conductor.manager     return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 681.074167] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 681.074167] nova-conductor[52435]: ERROR nova.conductor.manager     return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 681.074167] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 681.074167] nova-conductor[52435]: ERROR nova.conductor.manager     result = self.transport._send(
[ 681.074167] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 681.074167] nova-conductor[52435]: ERROR nova.conductor.manager     return self._driver.send(target, ctxt, message,
[ 681.074167] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 681.074167] nova-conductor[52435]: ERROR nova.conductor.manager     return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 681.074167] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 681.074167] nova-conductor[52435]: ERROR nova.conductor.manager     raise result
[ 681.074167] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 681.074167] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 681.074167] nova-conductor[52435]: ERROR nova.conductor.manager
[ 681.074167] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 681.074167] nova-conductor[52435]: ERROR nova.conductor.manager     return func(*args, **kwargs)
[ 681.074167] nova-conductor[52435]: ERROR nova.conductor.manager
[ 681.074167] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 681.074167] nova-conductor[52435]: ERROR nova.conductor.manager     selections = self._select_destinations(
[ 681.074167] nova-conductor[52435]: ERROR nova.conductor.manager
[ 681.074167] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 681.074167] nova-conductor[52435]: ERROR nova.conductor.manager     selections = self._schedule(
[ 681.074167] nova-conductor[52435]: ERROR nova.conductor.manager
[ 681.074167] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 681.074167] nova-conductor[52435]: ERROR nova.conductor.manager     self._ensure_sufficient_hosts(
[ 681.074167] nova-conductor[52435]: ERROR nova.conductor.manager
[ 681.074167] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 681.074167] nova-conductor[52435]: ERROR nova.conductor.manager     raise exception.NoValidHost(reason=reason)
[ 681.074167] nova-conductor[52435]: ERROR nova.conductor.manager
[ 681.074167] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 681.074167] nova-conductor[52435]: ERROR nova.conductor.manager
[ 681.074167] nova-conductor[52435]: ERROR nova.conductor.manager
[ 681.081173] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-d34d37c0-f3a5-4ee4-9540-0c7dc3643882 tempest-ServersTestJSON-1611326517 tempest-ServersTestJSON-1611326517-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 681.081403] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-d34d37c0-f3a5-4ee4-9540-0c7dc3643882 tempest-ServersTestJSON-1611326517 tempest-ServersTestJSON-1611326517-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 681.081585] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-d34d37c0-f3a5-4ee4-9540-0c7dc3643882 tempest-ServersTestJSON-1611326517 tempest-ServersTestJSON-1611326517-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 681.147271] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-d34d37c0-f3a5-4ee4-9540-0c7dc3643882 tempest-ServersTestJSON-1611326517 tempest-ServersTestJSON-1611326517-project-member] [instance: 959e648a-8d1c-4453-988d-cd405f43be58] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 681.148666] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-d34d37c0-f3a5-4ee4-9540-0c7dc3643882 tempest-ServersTestJSON-1611326517 tempest-ServersTestJSON-1611326517-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 681.148952] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-d34d37c0-f3a5-4ee4-9540-0c7dc3643882 tempest-ServersTestJSON-1611326517 tempest-ServersTestJSON-1611326517-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 681.149095] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-d34d37c0-f3a5-4ee4-9540-0c7dc3643882 tempest-ServersTestJSON-1611326517 tempest-ServersTestJSON-1611326517-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 681.152956] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-d34d37c0-f3a5-4ee4-9540-0c7dc3643882 tempest-ServersTestJSON-1611326517 tempest-ServersTestJSON-1611326517-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 681.152956] nova-conductor[52435]: Traceback (most recent call last):
[ 681.152956] nova-conductor[52435]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 681.152956] nova-conductor[52435]:     return func(*args, **kwargs)
[ 681.152956] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 681.152956] nova-conductor[52435]:     selections = self._select_destinations(
[ 681.152956] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 681.152956] nova-conductor[52435]:     selections = self._schedule(
[ 681.152956] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 681.152956] nova-conductor[52435]:     self._ensure_sufficient_hosts(
[ 681.152956] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 681.152956] nova-conductor[52435]:     raise exception.NoValidHost(reason=reason)
[ 681.152956] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 681.152956] nova-conductor[52435]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 681.153497] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-d34d37c0-f3a5-4ee4-9540-0c7dc3643882 tempest-ServersTestJSON-1611326517 tempest-ServersTestJSON-1611326517-project-member] [instance: 959e648a-8d1c-4453-988d-cd405f43be58] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 683.326498] nova-conductor[52436]: ERROR nova.conductor.manager [None req-9fc3964e-fff0-4513-ac48-480751e1d772 tempest-ServerShowV247Test-556280072 tempest-ServerShowV247Test-556280072-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 683.326498] nova-conductor[52436]: Traceback (most recent call last):
[ 683.326498] nova-conductor[52436]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 683.326498] nova-conductor[52436]:     return func(*args, **kwargs)
[ 683.326498] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 683.326498] nova-conductor[52436]:     selections = self._select_destinations(
[ 683.326498] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 683.326498] nova-conductor[52436]:     selections = self._schedule(
[ 683.326498] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 683.326498] nova-conductor[52436]:     self._ensure_sufficient_hosts(
[ 683.326498] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 683.326498] nova-conductor[52436]:     raise exception.NoValidHost(reason=reason)
[ 683.326498] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 683.326498] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 683.326498] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 683.326498] nova-conductor[52436]: ERROR nova.conductor.manager     host_lists = self._schedule_instances(context, request_specs[0],
[ 683.326498] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 683.326498] nova-conductor[52436]: ERROR nova.conductor.manager     host_lists = self.query_client.select_destinations(
[ 683.326498] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 683.326498] nova-conductor[52436]: ERROR nova.conductor.manager     return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 683.326498] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 683.326498] nova-conductor[52436]: ERROR nova.conductor.manager     return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 683.326498] nova-conductor[52436]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 683.326498] nova-conductor[52436]: ERROR nova.conductor.manager     result = self.transport._send(
[ 683.326498] nova-conductor[52436]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 683.326498] nova-conductor[52436]: ERROR nova.conductor.manager     return self._driver.send(target, ctxt, message,
[ 683.326498] nova-conductor[52436]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 683.326498] nova-conductor[52436]: ERROR nova.conductor.manager     return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 683.326498] nova-conductor[52436]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 683.326498] nova-conductor[52436]: ERROR nova.conductor.manager     raise result
[ 683.326498] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 683.326498] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 683.326498] nova-conductor[52436]: ERROR nova.conductor.manager
[ 683.326498] nova-conductor[52436]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 683.326498] nova-conductor[52436]: ERROR nova.conductor.manager     return func(*args, **kwargs)
[ 683.326498] nova-conductor[52436]: ERROR nova.conductor.manager
[ 683.326498] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 683.326498] nova-conductor[52436]: ERROR nova.conductor.manager     selections = self._select_destinations(
[ 683.326498] nova-conductor[52436]: ERROR nova.conductor.manager
[ 683.326498] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 683.326498] nova-conductor[52436]: ERROR nova.conductor.manager     selections = self._schedule(
[ 683.326498] nova-conductor[52436]: ERROR nova.conductor.manager
[ 683.326498] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 683.326498] nova-conductor[52436]: ERROR nova.conductor.manager     self._ensure_sufficient_hosts(
[ 683.326498] nova-conductor[52436]: ERROR nova.conductor.manager
[ 683.326498] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 683.326498] nova-conductor[52436]: ERROR nova.conductor.manager     raise exception.NoValidHost(reason=reason)
[ 683.326498] nova-conductor[52436]: ERROR nova.conductor.manager
[ 683.326498] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 683.326498] nova-conductor[52436]: ERROR nova.conductor.manager
[ 683.326498] nova-conductor[52436]: ERROR nova.conductor.manager
[ 683.341801] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-9fc3964e-fff0-4513-ac48-480751e1d772 tempest-ServerShowV247Test-556280072 tempest-ServerShowV247Test-556280072-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 683.343815] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-9fc3964e-fff0-4513-ac48-480751e1d772 tempest-ServerShowV247Test-556280072 tempest-ServerShowV247Test-556280072-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 683.344098] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-9fc3964e-fff0-4513-ac48-480751e1d772 tempest-ServerShowV247Test-556280072 tempest-ServerShowV247Test-556280072-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.002s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 683.389904] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-9fc3964e-fff0-4513-ac48-480751e1d772 tempest-ServerShowV247Test-556280072 tempest-ServerShowV247Test-556280072-project-member] [instance: 1133fdb1-446b-4f7a-8594-23dd35a35b78] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 683.389904] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-9fc3964e-fff0-4513-ac48-480751e1d772 tempest-ServerShowV247Test-556280072 tempest-ServerShowV247Test-556280072-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 683.389904] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-9fc3964e-fff0-4513-ac48-480751e1d772 tempest-ServerShowV247Test-556280072 tempest-ServerShowV247Test-556280072-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 683.389904] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-9fc3964e-fff0-4513-ac48-480751e1d772 tempest-ServerShowV247Test-556280072 tempest-ServerShowV247Test-556280072-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 683.391842] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-9fc3964e-fff0-4513-ac48-480751e1d772 tempest-ServerShowV247Test-556280072 tempest-ServerShowV247Test-556280072-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 683.391842] nova-conductor[52436]: Traceback (most recent call last):
[ 683.391842] nova-conductor[52436]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 683.391842] nova-conductor[52436]:     return func(*args, **kwargs)
[ 683.391842] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 683.391842] nova-conductor[52436]:     selections = self._select_destinations(
[ 683.391842] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 683.391842] nova-conductor[52436]:     selections = self._schedule(
[ 683.391842] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 683.391842] nova-conductor[52436]:     self._ensure_sufficient_hosts(
[ 683.391842] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 683.391842] nova-conductor[52436]:     raise exception.NoValidHost(reason=reason)
[ 683.391842] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 683.391842] nova-conductor[52436]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 683.392940] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-9fc3964e-fff0-4513-ac48-480751e1d772 tempest-ServerShowV247Test-556280072 tempest-ServerShowV247Test-556280072-project-member] [instance: 1133fdb1-446b-4f7a-8594-23dd35a35b78] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 684.444678] nova-conductor[52435]: ERROR nova.conductor.manager [None req-45faf100-3da9-4af5-b020-26ec15714260 tempest-ServerRescueNegativeTestJSON-1122600575 tempest-ServerRescueNegativeTestJSON-1122600575-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 684.444678] nova-conductor[52435]: Traceback (most recent call last):
[ 684.444678] nova-conductor[52435]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 684.444678] nova-conductor[52435]:     return func(*args, **kwargs)
[ 684.444678] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 684.444678] nova-conductor[52435]:     selections = self._select_destinations(
[ 684.444678] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 684.444678] nova-conductor[52435]:     selections = self._schedule(
[ 684.444678] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 684.444678] nova-conductor[52435]:     self._ensure_sufficient_hosts(
[ 684.444678] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 684.444678] nova-conductor[52435]:     raise exception.NoValidHost(reason=reason)
[ 684.444678] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 684.444678] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 684.444678] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 684.444678] nova-conductor[52435]: ERROR nova.conductor.manager     host_lists = self._schedule_instances(context, request_specs[0],
[ 684.444678] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 684.444678] nova-conductor[52435]: ERROR nova.conductor.manager     host_lists = self.query_client.select_destinations(
[ 684.444678] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 684.444678] nova-conductor[52435]: ERROR nova.conductor.manager     return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 684.444678] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 684.444678] nova-conductor[52435]: ERROR nova.conductor.manager     return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 684.444678] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 684.444678] nova-conductor[52435]: ERROR nova.conductor.manager     result = self.transport._send(
[ 684.444678] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 684.444678] nova-conductor[52435]: ERROR nova.conductor.manager     return self._driver.send(target, ctxt, message,
[ 684.444678] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 684.444678] nova-conductor[52435]: ERROR nova.conductor.manager     return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 684.444678] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 684.444678] nova-conductor[52435]: ERROR nova.conductor.manager     raise result
[ 684.444678] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 684.444678] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 684.444678] nova-conductor[52435]: ERROR nova.conductor.manager
[ 684.444678] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 684.444678] nova-conductor[52435]: ERROR nova.conductor.manager     return func(*args, **kwargs)
[ 684.444678] nova-conductor[52435]: ERROR nova.conductor.manager
[ 684.444678] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 684.444678] nova-conductor[52435]: ERROR nova.conductor.manager     selections = self._select_destinations(
[ 684.444678] nova-conductor[52435]: ERROR nova.conductor.manager
[ 684.444678] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 684.444678] nova-conductor[52435]: ERROR nova.conductor.manager     selections = self._schedule(
[ 684.444678] nova-conductor[52435]: ERROR nova.conductor.manager
[ 684.444678] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 684.444678] nova-conductor[52435]: ERROR nova.conductor.manager     self._ensure_sufficient_hosts(
[ 684.444678] nova-conductor[52435]: ERROR nova.conductor.manager
[ 684.444678] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 684.444678] nova-conductor[52435]: ERROR nova.conductor.manager     raise exception.NoValidHost(reason=reason)
[ 684.444678] nova-conductor[52435]: ERROR nova.conductor.manager
[ 684.444678] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 684.444678] nova-conductor[52435]: ERROR nova.conductor.manager
[ 684.444678] nova-conductor[52435]: ERROR nova.conductor.manager
[ 684.452296] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-45faf100-3da9-4af5-b020-26ec15714260 tempest-ServerRescueNegativeTestJSON-1122600575 tempest-ServerRescueNegativeTestJSON-1122600575-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 684.452504] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-45faf100-3da9-4af5-b020-26ec15714260 tempest-ServerRescueNegativeTestJSON-1122600575 tempest-ServerRescueNegativeTestJSON-1122600575-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 684.452684] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-45faf100-3da9-4af5-b020-26ec15714260 tempest-ServerRescueNegativeTestJSON-1122600575 tempest-ServerRescueNegativeTestJSON-1122600575-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 684.517597] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-45faf100-3da9-4af5-b020-26ec15714260 tempest-ServerRescueNegativeTestJSON-1122600575 tempest-ServerRescueNegativeTestJSON-1122600575-project-member] [instance: e91125dd-2352-4914-a269-e98db0479dc1] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 684.517597] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-45faf100-3da9-4af5-b020-26ec15714260 tempest-ServerRescueNegativeTestJSON-1122600575 tempest-ServerRescueNegativeTestJSON-1122600575-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 684.517846] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-45faf100-3da9-4af5-b020-26ec15714260 tempest-ServerRescueNegativeTestJSON-1122600575 tempest-ServerRescueNegativeTestJSON-1122600575-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 684.518590] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-45faf100-3da9-4af5-b020-26ec15714260 tempest-ServerRescueNegativeTestJSON-1122600575
tempest-ServerRescueNegativeTestJSON-1122600575-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 684.523373] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-45faf100-3da9-4af5-b020-26ec15714260 tempest-ServerRescueNegativeTestJSON-1122600575 tempest-ServerRescueNegativeTestJSON-1122600575-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 684.523373] nova-conductor[52435]: Traceback (most recent call last): [ 684.523373] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 684.523373] nova-conductor[52435]: return func(*args, **kwargs) [ 684.523373] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 684.523373] nova-conductor[52435]: selections = self._select_destinations( [ 684.523373] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 684.523373] nova-conductor[52435]: selections = self._schedule( [ 684.523373] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 684.523373] nova-conductor[52435]: self._ensure_sufficient_hosts( [ 684.523373] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 684.523373] nova-conductor[52435]: raise exception.NoValidHost(reason=reason) [ 684.523373] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 684.523373] nova-conductor[52435]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 684.523990] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-45faf100-3da9-4af5-b020-26ec15714260 tempest-ServerRescueNegativeTestJSON-1122600575 tempest-ServerRescueNegativeTestJSON-1122600575-project-member] [instance: e91125dd-2352-4914-a269-e98db0479dc1] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 688.269669] nova-conductor[52435]: ERROR nova.conductor.manager [None req-36351400-2291-4cfa-a130-bb1f920f2533 tempest-ServerActionsTestOtherB-121839142 tempest-ServerActionsTestOtherB-121839142-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 688.269669] nova-conductor[52435]: Traceback (most recent call last):
[ 688.269669] nova-conductor[52435]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 688.269669] nova-conductor[52435]:     return func(*args, **kwargs)
[ 688.269669] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 688.269669] nova-conductor[52435]:     selections = self._select_destinations(
[ 688.269669] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 688.269669] nova-conductor[52435]:     selections = self._schedule(
[ 688.269669] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 688.269669] nova-conductor[52435]:     self._ensure_sufficient_hosts(
[ 688.269669] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 688.269669] nova-conductor[52435]:     raise exception.NoValidHost(reason=reason)
[ 688.269669] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 688.269669] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 688.269669] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 688.269669] nova-conductor[52435]: ERROR nova.conductor.manager     host_lists = self._schedule_instances(context, request_specs[0],
[ 688.269669] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 688.269669] nova-conductor[52435]: ERROR nova.conductor.manager     host_lists = self.query_client.select_destinations(
[ 688.269669] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 688.269669] nova-conductor[52435]: ERROR nova.conductor.manager     return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 688.269669] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 688.269669] nova-conductor[52435]: ERROR nova.conductor.manager     return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 688.269669] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 688.269669] nova-conductor[52435]: ERROR nova.conductor.manager     result = self.transport._send(
[ 688.269669] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 688.269669] nova-conductor[52435]: ERROR nova.conductor.manager     return self._driver.send(target, ctxt, message,
[ 688.269669] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 688.269669] nova-conductor[52435]: ERROR nova.conductor.manager     return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 688.269669] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 688.269669] nova-conductor[52435]: ERROR nova.conductor.manager     raise result
[ 688.269669] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 688.269669] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 688.269669] nova-conductor[52435]: ERROR nova.conductor.manager
[ 688.269669] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 688.269669] nova-conductor[52435]: ERROR nova.conductor.manager     return func(*args, **kwargs)
[ 688.269669] nova-conductor[52435]: ERROR nova.conductor.manager
[ 688.269669] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 688.269669] nova-conductor[52435]: ERROR nova.conductor.manager     selections = self._select_destinations(
[ 688.269669] nova-conductor[52435]: ERROR nova.conductor.manager
[ 688.269669] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 688.269669] nova-conductor[52435]: ERROR nova.conductor.manager     selections = self._schedule(
[ 688.269669] nova-conductor[52435]: ERROR nova.conductor.manager
[ 688.269669] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 688.269669] nova-conductor[52435]: ERROR nova.conductor.manager     self._ensure_sufficient_hosts(
[ 688.269669] nova-conductor[52435]: ERROR nova.conductor.manager
[ 688.269669] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 688.269669] nova-conductor[52435]: ERROR nova.conductor.manager     raise exception.NoValidHost(reason=reason)
[ 688.269669] nova-conductor[52435]: ERROR nova.conductor.manager
[ 688.269669] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 688.269669] nova-conductor[52435]: ERROR nova.conductor.manager
[ 688.269669] nova-conductor[52435]: ERROR nova.conductor.manager
[ 688.278041] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-36351400-2291-4cfa-a130-bb1f920f2533 tempest-ServerActionsTestOtherB-121839142 tempest-ServerActionsTestOtherB-121839142-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 688.278041] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-36351400-2291-4cfa-a130-bb1f920f2533 tempest-ServerActionsTestOtherB-121839142 tempest-ServerActionsTestOtherB-121839142-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 688.278041] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-36351400-2291-4cfa-a130-bb1f920f2533 tempest-ServerActionsTestOtherB-121839142 tempest-ServerActionsTestOtherB-121839142-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 688.328789] nova-conductor[52436]: ERROR nova.conductor.manager [None req-07b3546e-4521-464d-8f2a-a476d772cc7d tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 688.328789] nova-conductor[52436]: Traceback (most recent call last):
[ 688.328789] nova-conductor[52436]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 688.328789] nova-conductor[52436]:     return func(*args, **kwargs)
[ 688.328789] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 688.328789] nova-conductor[52436]:     selections = self._select_destinations(
[ 688.328789] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 688.328789] nova-conductor[52436]:     selections = self._schedule(
[ 688.328789] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 688.328789] nova-conductor[52436]:     self._ensure_sufficient_hosts(
[ 688.328789] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 688.328789] nova-conductor[52436]:     raise exception.NoValidHost(reason=reason)
[ 688.328789] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 688.328789] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 688.328789] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 688.328789] nova-conductor[52436]: ERROR nova.conductor.manager     host_lists = self._schedule_instances(context, request_specs[0],
[ 688.328789] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 688.328789] nova-conductor[52436]: ERROR nova.conductor.manager     host_lists = self.query_client.select_destinations(
[ 688.328789] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 688.328789] nova-conductor[52436]: ERROR nova.conductor.manager     return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 688.328789] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 688.328789] nova-conductor[52436]: ERROR nova.conductor.manager     return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 688.328789] nova-conductor[52436]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 688.328789] nova-conductor[52436]: ERROR nova.conductor.manager     result = self.transport._send(
[ 688.328789] nova-conductor[52436]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 688.328789] nova-conductor[52436]: ERROR nova.conductor.manager     return self._driver.send(target, ctxt, message,
[ 688.328789] nova-conductor[52436]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 688.328789] nova-conductor[52436]: ERROR nova.conductor.manager     return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 688.328789] nova-conductor[52436]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 688.328789] nova-conductor[52436]: ERROR nova.conductor.manager     raise result
[ 688.328789] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 688.328789] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 688.328789] nova-conductor[52436]: ERROR nova.conductor.manager
[ 688.328789] nova-conductor[52436]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 688.328789] nova-conductor[52436]: ERROR nova.conductor.manager     return func(*args, **kwargs)
[ 688.328789] nova-conductor[52436]: ERROR nova.conductor.manager
[ 688.328789] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 688.328789] nova-conductor[52436]: ERROR nova.conductor.manager     selections = self._select_destinations(
[ 688.328789] nova-conductor[52436]: ERROR nova.conductor.manager
[ 688.328789] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 688.328789] nova-conductor[52436]: ERROR nova.conductor.manager     selections = self._schedule(
[ 688.328789] nova-conductor[52436]: ERROR nova.conductor.manager
[ 688.328789] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 688.328789] nova-conductor[52436]: ERROR nova.conductor.manager     self._ensure_sufficient_hosts(
[ 688.328789] nova-conductor[52436]: ERROR nova.conductor.manager
[ 688.328789] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 688.328789] nova-conductor[52436]: ERROR nova.conductor.manager     raise exception.NoValidHost(reason=reason)
[ 688.328789] nova-conductor[52436]: ERROR nova.conductor.manager
[ 688.328789] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 688.328789] nova-conductor[52436]: ERROR nova.conductor.manager
[ 688.328789] nova-conductor[52436]: ERROR nova.conductor.manager
[ 688.336105] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-07b3546e-4521-464d-8f2a-a476d772cc7d tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 688.336619] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-07b3546e-4521-464d-8f2a-a476d772cc7d tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 688.336619] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-07b3546e-4521-464d-8f2a-a476d772cc7d tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 688.373409] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-36351400-2291-4cfa-a130-bb1f920f2533 tempest-ServerActionsTestOtherB-121839142 tempest-ServerActionsTestOtherB-121839142-project-member] [instance: f4d61950-5a7c-45bf-b93e-c2e80530fdde] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 688.375827] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-36351400-2291-4cfa-a130-bb1f920f2533 tempest-ServerActionsTestOtherB-121839142 tempest-ServerActionsTestOtherB-121839142-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 688.375827] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-36351400-2291-4cfa-a130-bb1f920f2533 tempest-ServerActionsTestOtherB-121839142 tempest-ServerActionsTestOtherB-121839142-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 688.375827] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-36351400-2291-4cfa-a130-bb1f920f2533 tempest-ServerActionsTestOtherB-121839142 tempest-ServerActionsTestOtherB-121839142-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 688.380818] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-36351400-2291-4cfa-a130-bb1f920f2533 tempest-ServerActionsTestOtherB-121839142 tempest-ServerActionsTestOtherB-121839142-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 688.380818] nova-conductor[52435]: Traceback (most recent call last):
[ 688.380818] nova-conductor[52435]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 688.380818] nova-conductor[52435]:     return func(*args, **kwargs)
[ 688.380818] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 688.380818] nova-conductor[52435]:     selections = self._select_destinations(
[ 688.380818] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 688.380818] nova-conductor[52435]:     selections = self._schedule(
[ 688.380818] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 688.380818] nova-conductor[52435]:     self._ensure_sufficient_hosts(
[ 688.380818] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 688.380818] nova-conductor[52435]:     raise exception.NoValidHost(reason=reason)
[ 688.380818] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 688.380818] nova-conductor[52435]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 688.384149] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-36351400-2291-4cfa-a130-bb1f920f2533 tempest-ServerActionsTestOtherB-121839142 tempest-ServerActionsTestOtherB-121839142-project-member] [instance: f4d61950-5a7c-45bf-b93e-c2e80530fdde] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 688.399691] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-07b3546e-4521-464d-8f2a-a476d772cc7d tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: a5ec11ef-3476-44e0-b3d7-7a2b56f3b279] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 688.399691] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-07b3546e-4521-464d-8f2a-a476d772cc7d tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 688.399691] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-07b3546e-4521-464d-8f2a-a476d772cc7d tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 688.399691] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-07b3546e-4521-464d-8f2a-a476d772cc7d tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 688.403122] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-07b3546e-4521-464d-8f2a-a476d772cc7d tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 688.403122] nova-conductor[52436]: Traceback (most recent call last):
[ 688.403122] nova-conductor[52436]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 688.403122] nova-conductor[52436]:     return func(*args, **kwargs)
[ 688.403122] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 688.403122] nova-conductor[52436]:     selections = self._select_destinations(
[ 688.403122] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 688.403122] nova-conductor[52436]:     selections = self._schedule(
[ 688.403122] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 688.403122] nova-conductor[52436]:     self._ensure_sufficient_hosts(
[ 688.403122] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 688.403122] nova-conductor[52436]:     raise exception.NoValidHost(reason=reason)
[ 688.403122] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 688.403122] nova-conductor[52436]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 688.404187] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-07b3546e-4521-464d-8f2a-a476d772cc7d tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: a5ec11ef-3476-44e0-b3d7-7a2b56f3b279] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 689.286659] nova-conductor[52435]: ERROR nova.conductor.manager [None req-a2db66a2-09bb-4cb0-a9a6-be3b396adc76 tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 689.286659] nova-conductor[52435]: Traceback (most recent call last):
[ 689.286659] nova-conductor[52435]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 689.286659] nova-conductor[52435]:     return func(*args, **kwargs)
[ 689.286659] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 689.286659] nova-conductor[52435]:     selections = self._select_destinations(
[ 689.286659] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 689.286659] nova-conductor[52435]:     selections = self._schedule(
[ 689.286659] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 689.286659] nova-conductor[52435]:     self._ensure_sufficient_hosts(
[ 689.286659] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 689.286659] nova-conductor[52435]:     raise exception.NoValidHost(reason=reason)
[ 689.286659] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 689.286659] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 689.286659] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 689.286659] nova-conductor[52435]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0],
[ 689.286659] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 689.286659] nova-conductor[52435]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations(
[ 689.286659] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 689.286659] nova-conductor[52435]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 689.286659] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 689.286659] nova-conductor[52435]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 689.286659] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 689.286659] nova-conductor[52435]: ERROR nova.conductor.manager result = self.transport._send(
[ 689.286659] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 689.286659] nova-conductor[52435]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message,
[ 689.286659] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 689.286659] nova-conductor[52435]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 689.286659] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 689.286659] nova-conductor[52435]: ERROR nova.conductor.manager raise result
[ 689.286659] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 689.286659] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 689.286659] nova-conductor[52435]: ERROR nova.conductor.manager
[ 689.286659] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 689.286659] nova-conductor[52435]: ERROR nova.conductor.manager return func(*args, **kwargs)
[ 689.286659] nova-conductor[52435]: ERROR nova.conductor.manager
[ 689.286659] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 689.286659] nova-conductor[52435]: ERROR nova.conductor.manager selections = self._select_destinations(
[ 689.286659] nova-conductor[52435]: ERROR nova.conductor.manager
[ 689.286659] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 689.286659] nova-conductor[52435]: ERROR nova.conductor.manager selections = self._schedule(
[ 689.286659] nova-conductor[52435]: ERROR nova.conductor.manager
[ 689.286659] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 689.286659] nova-conductor[52435]: ERROR nova.conductor.manager self._ensure_sufficient_hosts(
[ 689.286659] nova-conductor[52435]: ERROR nova.conductor.manager
[ 689.286659] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 689.286659] nova-conductor[52435]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason)
[ 689.286659] nova-conductor[52435]: ERROR nova.conductor.manager
[ 689.286659] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 689.286659] nova-conductor[52435]: ERROR nova.conductor.manager
[ 689.286659] nova-conductor[52435]: ERROR nova.conductor.manager
[ 689.295916] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-a2db66a2-09bb-4cb0-a9a6-be3b396adc76 tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 689.296308] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-a2db66a2-09bb-4cb0-a9a6-be3b396adc76 tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 689.296572] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-a2db66a2-09bb-4cb0-a9a6-be3b396adc76 tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 689.351122] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-a2db66a2-09bb-4cb0-a9a6-be3b396adc76 tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] [instance: 49e3e619-7fa6-4dee-a447-b72668fca5d4] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 689.351894] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-a2db66a2-09bb-4cb0-a9a6-be3b396adc76 tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 689.352132] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-a2db66a2-09bb-4cb0-a9a6-be3b396adc76 tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 689.352347] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-a2db66a2-09bb-4cb0-a9a6-be3b396adc76 tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 689.355374] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-a2db66a2-09bb-4cb0-a9a6-be3b396adc76 tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 689.355374] nova-conductor[52435]: Traceback (most recent call last):
[ 689.355374] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 689.355374] nova-conductor[52435]: return func(*args, **kwargs)
[ 689.355374] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 689.355374] nova-conductor[52435]: selections = self._select_destinations(
[ 689.355374] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 689.355374] nova-conductor[52435]: selections = self._schedule(
[ 689.355374] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 689.355374] nova-conductor[52435]: self._ensure_sufficient_hosts(
[ 689.355374] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 689.355374] nova-conductor[52435]: raise exception.NoValidHost(reason=reason)
[ 689.355374] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 689.355374] nova-conductor[52435]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 689.355881] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-a2db66a2-09bb-4cb0-a9a6-be3b396adc76 tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] [instance: 49e3e619-7fa6-4dee-a447-b72668fca5d4] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 690.699588] nova-conductor[52436]: ERROR nova.conductor.manager [None req-0ec03996-4122-4954-93d2-c98ecdb0db25 tempest-ServerShowV247Test-556280072 tempest-ServerShowV247Test-556280072-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 690.699588] nova-conductor[52436]: Traceback (most recent call last):
[ 690.699588] nova-conductor[52436]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 690.699588] nova-conductor[52436]: return func(*args, **kwargs)
[ 690.699588] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 690.699588] nova-conductor[52436]: selections = self._select_destinations(
[ 690.699588] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 690.699588] nova-conductor[52436]: selections = self._schedule(
[ 690.699588] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 690.699588] nova-conductor[52436]: self._ensure_sufficient_hosts(
[ 690.699588] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 690.699588] nova-conductor[52436]: raise exception.NoValidHost(reason=reason)
[ 690.699588] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 690.699588] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 690.699588] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 690.699588] nova-conductor[52436]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0],
[ 690.699588] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 690.699588] nova-conductor[52436]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations(
[ 690.699588] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 690.699588] nova-conductor[52436]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 690.699588] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 690.699588] nova-conductor[52436]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 690.699588] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 690.699588] nova-conductor[52436]: ERROR nova.conductor.manager result = self.transport._send(
[ 690.699588] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 690.699588] nova-conductor[52436]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message,
[ 690.699588] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 690.699588] nova-conductor[52436]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 690.699588] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 690.699588] nova-conductor[52436]: ERROR nova.conductor.manager raise result
[ 690.699588] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 690.699588] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 690.699588] nova-conductor[52436]: ERROR nova.conductor.manager
[ 690.699588] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 690.699588] nova-conductor[52436]: ERROR nova.conductor.manager return func(*args, **kwargs)
[ 690.699588] nova-conductor[52436]: ERROR nova.conductor.manager
[ 690.699588] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 690.699588] nova-conductor[52436]: ERROR nova.conductor.manager selections = self._select_destinations(
[ 690.699588] nova-conductor[52436]: ERROR nova.conductor.manager
[ 690.699588] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 690.699588] nova-conductor[52436]: ERROR nova.conductor.manager selections = self._schedule(
[ 690.699588] nova-conductor[52436]: ERROR nova.conductor.manager
[ 690.699588] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 690.699588] nova-conductor[52436]: ERROR nova.conductor.manager self._ensure_sufficient_hosts(
[ 690.699588] nova-conductor[52436]: ERROR nova.conductor.manager
[ 690.699588] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 690.699588] nova-conductor[52436]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason)
[ 690.699588] nova-conductor[52436]: ERROR nova.conductor.manager
[ 690.699588] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 690.699588] nova-conductor[52436]: ERROR nova.conductor.manager
[ 690.699588] nova-conductor[52436]: ERROR nova.conductor.manager
[ 690.708520] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-0ec03996-4122-4954-93d2-c98ecdb0db25 tempest-ServerShowV247Test-556280072 tempest-ServerShowV247Test-556280072-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 690.709733] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-0ec03996-4122-4954-93d2-c98ecdb0db25 tempest-ServerShowV247Test-556280072 tempest-ServerShowV247Test-556280072-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 690.709960] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-0ec03996-4122-4954-93d2-c98ecdb0db25 tempest-ServerShowV247Test-556280072 tempest-ServerShowV247Test-556280072-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 690.770981] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-0ec03996-4122-4954-93d2-c98ecdb0db25 tempest-ServerShowV247Test-556280072 tempest-ServerShowV247Test-556280072-project-member] [instance: c537a7d4-6618-4cfa-afd5-209b99692d65] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 690.772250] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-0ec03996-4122-4954-93d2-c98ecdb0db25 tempest-ServerShowV247Test-556280072 tempest-ServerShowV247Test-556280072-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 690.772464] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-0ec03996-4122-4954-93d2-c98ecdb0db25 tempest-ServerShowV247Test-556280072 tempest-ServerShowV247Test-556280072-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 690.772632] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-0ec03996-4122-4954-93d2-c98ecdb0db25 tempest-ServerShowV247Test-556280072 tempest-ServerShowV247Test-556280072-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 690.776516] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-0ec03996-4122-4954-93d2-c98ecdb0db25 tempest-ServerShowV247Test-556280072 tempest-ServerShowV247Test-556280072-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 690.776516] nova-conductor[52436]: Traceback (most recent call last):
[ 690.776516] nova-conductor[52436]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 690.776516] nova-conductor[52436]: return func(*args, **kwargs)
[ 690.776516] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 690.776516] nova-conductor[52436]: selections = self._select_destinations(
[ 690.776516] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 690.776516] nova-conductor[52436]: selections = self._schedule(
[ 690.776516] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 690.776516] nova-conductor[52436]: self._ensure_sufficient_hosts(
[ 690.776516] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 690.776516] nova-conductor[52436]: raise exception.NoValidHost(reason=reason)
[ 690.776516] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 690.776516] nova-conductor[52436]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 690.777272] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-0ec03996-4122-4954-93d2-c98ecdb0db25 tempest-ServerShowV247Test-556280072 tempest-ServerShowV247Test-556280072-project-member] [instance: c537a7d4-6618-4cfa-afd5-209b99692d65] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 693.888315] nova-conductor[52435]: ERROR nova.conductor.manager [None req-6d337fe4-b5d0-4d83-a8ef-8b2decf79164 tempest-ImagesOneServerTestJSON-66817639 tempest-ImagesOneServerTestJSON-66817639-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 693.888315] nova-conductor[52435]: Traceback (most recent call last):
[ 693.888315] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 693.888315] nova-conductor[52435]: return func(*args, **kwargs)
[ 693.888315] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 693.888315] nova-conductor[52435]: selections = self._select_destinations(
[ 693.888315] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 693.888315] nova-conductor[52435]: selections = self._schedule(
[ 693.888315] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 693.888315] nova-conductor[52435]: self._ensure_sufficient_hosts(
[ 693.888315] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 693.888315] nova-conductor[52435]: raise exception.NoValidHost(reason=reason)
[ 693.888315] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 693.888315] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 693.888315] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 693.888315] nova-conductor[52435]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0],
[ 693.888315] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 693.888315] nova-conductor[52435]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations(
[ 693.888315] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 693.888315] nova-conductor[52435]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 693.888315] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 693.888315] nova-conductor[52435]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 693.888315] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 693.888315] nova-conductor[52435]: ERROR nova.conductor.manager result = self.transport._send(
[ 693.888315] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 693.888315] nova-conductor[52435]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message,
[ 693.888315] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 693.888315] nova-conductor[52435]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 693.888315] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 693.888315] nova-conductor[52435]: ERROR nova.conductor.manager raise result
[ 693.888315] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 693.888315] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 693.888315] nova-conductor[52435]: ERROR nova.conductor.manager
[ 693.888315] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 693.888315] nova-conductor[52435]: ERROR nova.conductor.manager return func(*args, **kwargs)
[ 693.888315] nova-conductor[52435]: ERROR nova.conductor.manager
[ 693.888315] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 693.888315] nova-conductor[52435]: ERROR nova.conductor.manager selections = self._select_destinations(
[ 693.888315] nova-conductor[52435]: ERROR nova.conductor.manager
[ 693.888315] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 693.888315] nova-conductor[52435]: ERROR nova.conductor.manager selections = self._schedule(
[ 693.888315] nova-conductor[52435]: ERROR nova.conductor.manager
[ 693.888315] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 693.888315] nova-conductor[52435]: ERROR nova.conductor.manager self._ensure_sufficient_hosts(
[ 693.888315] nova-conductor[52435]: ERROR nova.conductor.manager
[ 693.888315] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 693.888315] nova-conductor[52435]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason)
[ 693.888315] nova-conductor[52435]: ERROR nova.conductor.manager
[ 693.888315] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 693.888315] nova-conductor[52435]: ERROR nova.conductor.manager
[ 693.888315] nova-conductor[52435]: ERROR nova.conductor.manager
[ 693.903927] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-6d337fe4-b5d0-4d83-a8ef-8b2decf79164 tempest-ImagesOneServerTestJSON-66817639 tempest-ImagesOneServerTestJSON-66817639-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 693.904273] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-6d337fe4-b5d0-4d83-a8ef-8b2decf79164 tempest-ImagesOneServerTestJSON-66817639 tempest-ImagesOneServerTestJSON-66817639-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 693.904496] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-6d337fe4-b5d0-4d83-a8ef-8b2decf79164 tempest-ImagesOneServerTestJSON-66817639 tempest-ImagesOneServerTestJSON-66817639-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 693.979321] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-6d337fe4-b5d0-4d83-a8ef-8b2decf79164 tempest-ImagesOneServerTestJSON-66817639 tempest-ImagesOneServerTestJSON-66817639-project-member] [instance: 56ff71b7-3b44-4860-8ce8-8b75757a569c] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 693.980108] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-6d337fe4-b5d0-4d83-a8ef-8b2decf79164 tempest-ImagesOneServerTestJSON-66817639 tempest-ImagesOneServerTestJSON-66817639-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 693.980329] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-6d337fe4-b5d0-4d83-a8ef-8b2decf79164 tempest-ImagesOneServerTestJSON-66817639 tempest-ImagesOneServerTestJSON-66817639-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 693.980840] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-6d337fe4-b5d0-4d83-a8ef-8b2decf79164 tempest-ImagesOneServerTestJSON-66817639 tempest-ImagesOneServerTestJSON-66817639-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 693.984593] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-6d337fe4-b5d0-4d83-a8ef-8b2decf79164 tempest-ImagesOneServerTestJSON-66817639 tempest-ImagesOneServerTestJSON-66817639-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 693.984593] nova-conductor[52435]: Traceback (most recent call last):
[ 693.984593] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 693.984593] nova-conductor[52435]: return func(*args, **kwargs)
[ 693.984593] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 693.984593] nova-conductor[52435]: selections = self._select_destinations(
[ 693.984593] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 693.984593] nova-conductor[52435]: selections = self._schedule(
[ 693.984593] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 693.984593] nova-conductor[52435]: self._ensure_sufficient_hosts(
[ 693.984593] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 693.984593] nova-conductor[52435]: raise exception.NoValidHost(reason=reason)
[ 693.984593] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 693.984593] nova-conductor[52435]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 693.984593] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-6d337fe4-b5d0-4d83-a8ef-8b2decf79164 tempest-ImagesOneServerTestJSON-66817639 tempest-ImagesOneServerTestJSON-66817639-project-member] [instance: 56ff71b7-3b44-4860-8ce8-8b75757a569c] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 695.339647] nova-conductor[52436]: ERROR nova.conductor.manager [None req-a31e40b8-7003-4f1a-9167-3faca6b641b6 tempest-FloatingIPsAssociationTestJSON-489888283 tempest-FloatingIPsAssociationTestJSON-489888283-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 695.339647] nova-conductor[52436]: Traceback (most recent call last):
[ 695.339647] nova-conductor[52436]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 695.339647] nova-conductor[52436]: return func(*args, **kwargs)
[ 695.339647] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 695.339647] nova-conductor[52436]: selections = self._select_destinations(
[ 695.339647] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 695.339647] nova-conductor[52436]: selections = self._schedule(
[ 695.339647] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 695.339647] nova-conductor[52436]: self._ensure_sufficient_hosts(
[ 695.339647] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 695.339647] nova-conductor[52436]: raise exception.NoValidHost(reason=reason)
[ 695.339647] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 695.339647] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 695.339647] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 695.339647] nova-conductor[52436]: ERROR nova.conductor.manager     host_lists = self._schedule_instances(context, request_specs[0],
[ 695.339647] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 695.339647] nova-conductor[52436]: ERROR nova.conductor.manager     host_lists = self.query_client.select_destinations(
[ 695.339647] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 695.339647] nova-conductor[52436]: ERROR nova.conductor.manager     return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 695.339647] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 695.339647] nova-conductor[52436]: ERROR nova.conductor.manager     return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 695.339647] nova-conductor[52436]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 695.339647] nova-conductor[52436]: ERROR nova.conductor.manager     result = self.transport._send(
[ 695.339647] nova-conductor[52436]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 695.339647] nova-conductor[52436]: ERROR nova.conductor.manager     return self._driver.send(target, ctxt, message,
[ 695.339647] nova-conductor[52436]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 695.339647] nova-conductor[52436]: ERROR nova.conductor.manager     return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 695.339647] nova-conductor[52436]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 695.339647] nova-conductor[52436]: ERROR nova.conductor.manager     raise result
[ 695.339647] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 695.339647] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 695.339647] nova-conductor[52436]: ERROR nova.conductor.manager 
[ 695.339647] nova-conductor[52436]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 695.339647] nova-conductor[52436]: ERROR nova.conductor.manager     return func(*args, **kwargs)
[ 695.339647] nova-conductor[52436]: ERROR nova.conductor.manager 
[ 695.339647] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 695.339647] nova-conductor[52436]: ERROR nova.conductor.manager     selections = self._select_destinations(
[ 695.339647] nova-conductor[52436]: ERROR nova.conductor.manager 
[ 695.339647] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 695.339647] nova-conductor[52436]: ERROR nova.conductor.manager     selections = self._schedule(
[ 695.339647] nova-conductor[52436]: ERROR nova.conductor.manager 
[ 695.339647] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 695.339647] nova-conductor[52436]: ERROR nova.conductor.manager     self._ensure_sufficient_hosts(
[ 695.339647] nova-conductor[52436]: ERROR nova.conductor.manager 
[ 695.339647] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 695.339647] nova-conductor[52436]: ERROR nova.conductor.manager     raise exception.NoValidHost(reason=reason)
[ 695.339647] nova-conductor[52436]: ERROR nova.conductor.manager 
[ 695.339647] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 695.339647] nova-conductor[52436]: ERROR nova.conductor.manager 
[ 695.339647] nova-conductor[52436]: ERROR nova.conductor.manager 
[ 695.346432] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-a31e40b8-7003-4f1a-9167-3faca6b641b6 tempest-FloatingIPsAssociationTestJSON-489888283 tempest-FloatingIPsAssociationTestJSON-489888283-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 695.346735] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-a31e40b8-7003-4f1a-9167-3faca6b641b6 tempest-FloatingIPsAssociationTestJSON-489888283 tempest-FloatingIPsAssociationTestJSON-489888283-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 695.346924] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-a31e40b8-7003-4f1a-9167-3faca6b641b6 tempest-FloatingIPsAssociationTestJSON-489888283 tempest-FloatingIPsAssociationTestJSON-489888283-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 695.400429] nova-conductor[52436]: DEBUG
nova.conductor.manager [None req-a31e40b8-7003-4f1a-9167-3faca6b641b6 tempest-FloatingIPsAssociationTestJSON-489888283 tempest-FloatingIPsAssociationTestJSON-489888283-project-member] [instance: 39a71a25-3fb7-4fbf-a778-a3fe135dc3c1] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 695.400429] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-a31e40b8-7003-4f1a-9167-3faca6b641b6 tempest-FloatingIPsAssociationTestJSON-489888283 tempest-FloatingIPsAssociationTestJSON-489888283-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 695.400429] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-a31e40b8-7003-4f1a-9167-3faca6b641b6 tempest-FloatingIPsAssociationTestJSON-489888283 tempest-FloatingIPsAssociationTestJSON-489888283-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 695.400429] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-a31e40b8-7003-4f1a-9167-3faca6b641b6 tempest-FloatingIPsAssociationTestJSON-489888283 
tempest-FloatingIPsAssociationTestJSON-489888283-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 695.404356] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-a31e40b8-7003-4f1a-9167-3faca6b641b6 tempest-FloatingIPsAssociationTestJSON-489888283 tempest-FloatingIPsAssociationTestJSON-489888283-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 695.404356] nova-conductor[52436]: Traceback (most recent call last):
[ 695.404356] nova-conductor[52436]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 695.404356] nova-conductor[52436]:     return func(*args, **kwargs)
[ 695.404356] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 695.404356] nova-conductor[52436]:     selections = self._select_destinations(
[ 695.404356] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 695.404356] nova-conductor[52436]:     selections = self._schedule(
[ 695.404356] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 695.404356] nova-conductor[52436]:     self._ensure_sufficient_hosts(
[ 695.404356] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 695.404356] nova-conductor[52436]:     raise exception.NoValidHost(reason=reason)
[ 695.404356] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 695.404356] nova-conductor[52436]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 695.405377] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-a31e40b8-7003-4f1a-9167-3faca6b641b6 tempest-FloatingIPsAssociationTestJSON-489888283 tempest-FloatingIPsAssociationTestJSON-489888283-project-member] [instance: 39a71a25-3fb7-4fbf-a778-a3fe135dc3c1] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 696.512868] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Took 0.15 seconds to select destinations for 1 instance(s). {{(pid=52436) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}}
[ 696.526974] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 696.527442] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 696.527442] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 696.561107] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 696.561107] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 696.561107] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 696.561107] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 696.561107] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 696.561107] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 696.573865] nova-conductor[52436]: DEBUG nova.quota [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Getting quotas for project 38f2b7c76cb04e49b1b8ac75980011b2. Resources: {'instances', 'ram', 'cores'} {{(pid=52436) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 696.577534] nova-conductor[52436]: DEBUG nova.quota [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Getting quotas for user 9a2e115241bf4f4491e4736c14c8c75f and project 38f2b7c76cb04e49b1b8ac75980011b2. 
Resources: {'instances', 'ram', 'cores'} {{(pid=52436) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 696.584138] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52436) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 696.584599] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 696.584799] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 696.584961] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 696.592756] nova-conductor[52436]: DEBUG nova.conductor.manager 
[None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 696.593458] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 696.593657] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 696.593841] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Lock 
"b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 696.614989] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 696.616206] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 696.616206] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 700.283588] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Took 0.14 seconds to select destinations for 1 instance(s). 
{{(pid=52435) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 700.296941] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 700.296941] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 700.297312] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 700.347750] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 700.347964] nova-conductor[52435]: DEBUG 
oslo_concurrency.lockutils [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 700.348153] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 700.348512] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 700.348691] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 700.348845] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 
tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 700.360585] nova-conductor[52435]: DEBUG nova.quota [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Getting quotas for project 0ca4a4d35bf24c4888af9ba6b2f4717b. Resources: {'instances', 'ram', 'cores'} {{(pid=52435) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 700.366258] nova-conductor[52435]: DEBUG nova.quota [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Getting quotas for user 5ed904e8ae1e4c3093ea78c668aa6573 and project 0ca4a4d35bf24c4888af9ba6b2f4717b. 
Resources: {'instances', 'ram', 'cores'} {{(pid=52435) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 700.372487] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52435) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 700.372808] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 700.373072] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 700.373177] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 700.378522] 
nova-conductor[52435]: DEBUG nova.conductor.manager [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 700.379660] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 700.379660] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 700.379772] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 700.395226] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 700.395578] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 700.395792] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 700.492266] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Took 0.19 seconds to select destinations for 1 instance(s). {{(pid=52436) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}}
[ 700.504535] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 700.504752] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 700.504917] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 700.535886] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 700.535886] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 700.536103] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 700.536517] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 700.537625] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 700.537625] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 700.546112] nova-conductor[52436]: DEBUG nova.quota [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Getting quotas for project 6448dab5c83b44e48c3ea2bd37691788. Resources: {'instances', 'ram', 'cores'} {{(pid=52436) _get_quotas /opt/stack/nova/nova/quota.py:393}}
[ 700.548560] nova-conductor[52436]: DEBUG nova.quota [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Getting quotas for user 4f78185aeccd4e96b19c49aa985f446d and project 6448dab5c83b44e48c3ea2bd37691788. Resources: {'instances', 'ram', 'cores'} {{(pid=52436) _get_quotas /opt/stack/nova/nova/quota.py:383}}
[ 700.560496] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52436) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}}
[ 700.560496] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 700.560496] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 700.560496] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 700.564024] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 700.564814] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 700.565073] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 700.566096] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 700.583034] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 700.583214] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 700.583419] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 702.303456] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Took 0.17 seconds to select destinations for 1 instance(s). {{(pid=52435) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}}
[ 702.318706] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 702.319096] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 702.319386] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 702.364618] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 702.364618] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 702.364618] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 702.364618] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 702.364618] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 702.364618] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 702.372432] nova-conductor[52435]: DEBUG nova.quota [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Getting quotas for project d742fb05f93f44a9b9c8207f47e77730. Resources: {'instances', 'ram', 'cores'} {{(pid=52435) _get_quotas /opt/stack/nova/nova/quota.py:393}}
[ 702.376833] nova-conductor[52435]: DEBUG nova.quota [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Getting quotas for user 9ae0c3fdf5814c20819e4329e87733e3 and project d742fb05f93f44a9b9c8207f47e77730. Resources: {'instances', 'ram', 'cores'} {{(pid=52435) _get_quotas /opt/stack/nova/nova/quota.py:383}}
[ 702.381700] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52435) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}}
[ 702.383586] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 702.383586] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 702.383586] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 702.385924] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 702.386640] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 702.386842] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 702.387015] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 702.407293] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 702.407521] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 702.407684] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 705.528021] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Took 0.14 seconds to select destinations for 1 instance(s). {{(pid=52436) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}}
[ 705.542073] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 705.542270] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 705.542863] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 705.582743] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 705.582970] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 705.583435] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 705.583516] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 705.583972] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 705.583972] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 705.594944] nova-conductor[52436]: DEBUG nova.quota [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Getting quotas for project 3a9c7dc4e938463781dbded4e5382286. Resources: {'instances', 'ram', 'cores'} {{(pid=52436) _get_quotas /opt/stack/nova/nova/quota.py:393}}
[ 705.596550] nova-conductor[52436]: DEBUG nova.quota [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Getting quotas for user 9f30a9b74cf54f4dbc2aa343e2b3298e and project 3a9c7dc4e938463781dbded4e5382286. Resources: {'instances', 'ram', 'cores'} {{(pid=52436) _get_quotas /opt/stack/nova/nova/quota.py:383}}
[ 705.603939] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52436) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}}
[ 705.604463] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 705.604674] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 705.604842] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 705.608304] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 705.609062] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 705.609272] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 705.609438] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 705.633983] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 705.634333] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 705.634424] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 705.896667] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Took 0.14 seconds to select destinations for 1 instance(s). {{(pid=52436) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}}
[ 705.912967] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 705.913254] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 705.913448] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 705.949355] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 705.949627] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 705.949875] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 705.950252] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 705.950729] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 705.950908] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 705.967101] nova-conductor[52436]: DEBUG nova.quota [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Getting quotas for project a9e33b2e4b8c439a8e8a557ddda22fce. Resources: {'instances', 'ram', 'cores'} {{(pid=52436) _get_quotas /opt/stack/nova/nova/quota.py:393}}
[ 705.972893] nova-conductor[52436]: DEBUG nova.quota [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Getting quotas for user baef766334764dd9ab481d3a2aacd07b and project a9e33b2e4b8c439a8e8a557ddda22fce.
Resources: {'instances', 'ram', 'cores'} {{(pid=52436) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 705.981424] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52436) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 705.981424] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 705.981556] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 705.981741] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 705.990667] nova-conductor[52436]: DEBUG nova.conductor.manager 
[None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 705.991386] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 705.991587] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 705.991940] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock 
"b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 706.006028] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 706.006028] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 706.006028] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 706.740405] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Took 0.13 seconds to select destinations for 1 instance(s). 
{{(pid=52435) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 706.758761] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 706.758994] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 706.759209] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 706.800884] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 706.801140] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None 
req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 706.801409] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 706.801781] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 706.801967] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 706.802140] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Lock 
"b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 706.810631] nova-conductor[52435]: DEBUG nova.quota [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Getting quotas for project 3794741ad1ea4973b8fbab68114f28c3. Resources: {'instances', 'ram', 'cores'} {{(pid=52435) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 706.813549] nova-conductor[52435]: DEBUG nova.quota [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Getting quotas for user 3f6fc368c3264c3b902fa4539548cb86 and project 3794741ad1ea4973b8fbab68114f28c3. Resources: {'instances', 'ram', 'cores'} {{(pid=52435) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 706.821472] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52435) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 706.821970] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 706.822748] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils 
[None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 706.822748] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 706.825176] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 706.825810] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Acquiring lock 
"b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 706.826337] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 706.826337] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 706.841273] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 706.841492] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 706.841663] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 708.501021] nova-conductor[52436]: ERROR nova.scheduler.utils [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in 
wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port f4843203-5a26-458c-986f-a4c59da7d9c3, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File 
"/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 67b01666-6233-4af8-a0ec-a4e938b82606 was re-scheduled: Binding failed for port f4843203-5a26-458c-986f-a4c59da7d9c3, please check neutron logs for more information.\n'] [ 708.501330] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Rescheduling: True {{(pid=52436) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 708.501595] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 67b01666-6233-4af8-a0ec-a4e938b82606.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 67b01666-6233-4af8-a0ec-a4e938b82606. [ 708.501898] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 67b01666-6233-4af8-a0ec-a4e938b82606. 
[ 708.527898] nova-conductor[52436]: DEBUG nova.network.neutron [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] deallocate_for_instance() {{(pid=52436) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 708.772121] nova-conductor[52436]: DEBUG nova.network.neutron [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] Instance cache missing network info. {{(pid=52436) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 708.773721] nova-conductor[52436]: DEBUG nova.network.neutron [None req-5fcdf030-5581-4c31-b555-6f450d6c557c tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 67b01666-6233-4af8-a0ec-a4e938b82606] Updating instance_info_cache with network_info: [] {{(pid=52436) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 709.405024] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Took 0.16 seconds to select destinations for 1 instance(s). 
{{(pid=52435) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 709.417388] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 709.417620] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 709.417788] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 709.448724] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 709.449008] nova-conductor[52435]: 
DEBUG oslo_concurrency.lockutils [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 709.449196] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 709.449608] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 709.449761] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 709.449920] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 
tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 709.458818] nova-conductor[52435]: DEBUG nova.quota [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Getting quotas for project 8349984ea81841b8880696d0a1326b35. Resources: {'instances', 'ram', 'cores'} {{(pid=52435) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 709.461659] nova-conductor[52435]: DEBUG nova.quota [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Getting quotas for user 4be8262f30274ebb9516f2ec280a6a40 and project 8349984ea81841b8880696d0a1326b35. 
Resources: {'instances', 'ram', 'cores'} {{(pid=52435) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 709.467813] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52435) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 709.468333] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 709.468572] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 709.468751] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 
709.471874] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 709.472232] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 709.472430] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 709.472595] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 
tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 709.484954] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 709.485267] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 709.485476] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 711.784382] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Took 0.16 
seconds to select destinations for 1 instance(s). {{(pid=52435) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 711.797920] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 711.798162] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 711.798328] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 711.848607] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 711.848607] nova-conductor[52435]: DEBUG 
oslo_concurrency.lockutils [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 711.848607] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 711.848607] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 711.848607] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 711.848607] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 
tempest-ServersNegativeTestJSON-1843437481-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 711.860624] nova-conductor[52435]: DEBUG nova.quota [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Getting quotas for project 257f2e74ff3341ffbc4982cb07c324fa. Resources: {'instances', 'ram', 'cores'} {{(pid=52435) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 711.863815] nova-conductor[52435]: DEBUG nova.quota [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Getting quotas for user 108982ac95244915883daf6a6b4b7f35 and project 257f2e74ff3341ffbc4982cb07c324fa. Resources: {'instances', 'ram', 'cores'} {{(pid=52435) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 711.872913] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52435) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 711.873440] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 711.873653] 
nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 711.873833] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 711.879708] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 711.879708] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 
tempest-ServersNegativeTestJSON-1843437481-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 711.879708] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 711.879708] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 711.891639] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 711.891862] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 711.892460] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 711.911998] nova-conductor[52435]: ERROR nova.scheduler.utils [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in 
wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port b8a09541-861d-4a6e-bed9-a46fc8baadf6, please check neutron logs for more information.\n', '\nDuring handling of 
the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 37331817-f277-4f32-8d5a-11e1cf63f2b7 was re-scheduled: Binding failed for port b8a09541-861d-4a6e-bed9-a46fc8baadf6, please check neutron logs for more information.\n'] [ 711.913085] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Rescheduling: True {{(pid=52435) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 711.913337] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 37331817-f277-4f32-8d5a-11e1cf63f2b7.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 37331817-f277-4f32-8d5a-11e1cf63f2b7. [ 711.913763] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. 
Exhausted all hosts available for retrying build failures for instance 37331817-f277-4f32-8d5a-11e1cf63f2b7. [ 711.937922] nova-conductor[52435]: DEBUG nova.network.neutron [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] deallocate_for_instance() {{(pid=52435) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 712.113318] nova-conductor[52435]: DEBUG nova.network.neutron [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] Instance cache missing network info. {{(pid=52435) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 712.116726] nova-conductor[52435]: DEBUG nova.network.neutron [None req-1f805b44-4a24-4176-9eaa-856adf31a9c4 tempest-ServerRescueTestJSONUnderV235-1094407707 tempest-ServerRescueTestJSONUnderV235-1094407707-project-member] [instance: 37331817-f277-4f32-8d5a-11e1cf63f2b7] Updating instance_info_cache with network_info: [] {{(pid=52435) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 713.220491] nova-conductor[52435]: ERROR nova.scheduler.utils [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File 
"/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 3fc4c3ad-2447-45d8-941c-973977c7c5b9, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 5b59d527-232e-4ef1-bc83-4e8671607db1 was re-scheduled: Binding failed for port 3fc4c3ad-2447-45d8-941c-973977c7c5b9, please check neutron logs for more information.\n'] [ 713.221081] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Rescheduling: True {{(pid=52435) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 713.221284] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. 
Exhausted all hosts available for retrying build failures for instance 5b59d527-232e-4ef1-bc83-4e8671607db1.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 5b59d527-232e-4ef1-bc83-4e8671607db1. [ 713.221493] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 5b59d527-232e-4ef1-bc83-4e8671607db1. [ 713.256205] nova-conductor[52435]: DEBUG nova.network.neutron [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] deallocate_for_instance() {{(pid=52435) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 713.352121] nova-conductor[52435]: DEBUG nova.network.neutron [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] Instance cache missing network info. 
{{(pid=52435) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 713.355909] nova-conductor[52435]: DEBUG nova.network.neutron [None req-842661e4-47db-40b6-a177-abdbb515f635 tempest-SecurityGroupsTestJSON-1410390636 tempest-SecurityGroupsTestJSON-1410390636-project-member] [instance: 5b59d527-232e-4ef1-bc83-4e8671607db1] Updating instance_info_cache with network_info: [] {{(pid=52435) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 713.383998] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Took 0.16 seconds to select destinations for 1 instance(s). {{(pid=52436) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 713.400773] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 713.401008] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 713.401187] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Lock 
"b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 713.433321] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 713.433545] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 713.433711] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 713.434068] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 713.434257] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 713.434419] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 713.445538] nova-conductor[52436]: DEBUG nova.quota [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Getting quotas for project 6f4d626262ba4bdf905536d0a5919b61. Resources: {'instances', 'ram', 'cores'} {{(pid=52436) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 713.448298] nova-conductor[52436]: DEBUG nova.quota [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Getting quotas for user 0f2d798fa67b4212ba9b0cba90b00820 and project 6f4d626262ba4bdf905536d0a5919b61. 
Resources: {'instances', 'ram', 'cores'} {{(pid=52436) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 713.456536] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52436) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 713.456969] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 713.457188] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 713.457396] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 713.460317] nova-conductor[52436]: DEBUG nova.conductor.manager [None 
req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 713.460966] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 713.461184] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 713.461352] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 713.473785] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 713.473995] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 713.474178] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 716.253669] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Took 0.13 seconds to select destinations for 1 instance(s). 
{{(pid=52435) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 716.264785] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 716.265099] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 716.265314] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 716.295464] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 
716.295756] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 716.295961] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 716.296451] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 716.296673] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 716.296883] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None 
req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 716.305257] nova-conductor[52435]: DEBUG nova.quota [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Getting quotas for project b868df8009cd4d07b50856808eceb007. Resources: {'instances', 'ram', 'cores'} {{(pid=52435) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 716.307613] nova-conductor[52435]: DEBUG nova.quota [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Getting quotas for user 816a348b026b4041b8e75233830d4736 and project b868df8009cd4d07b50856808eceb007. 
Resources: {'instances', 'ram', 'cores'} {{(pid=52435) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 716.313188] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52435) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 716.313697] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 716.313938] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 716.314157] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 716.319443] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 716.320121] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 716.320426] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 716.320494] nova-conductor[52435]: DEBUG 
oslo_concurrency.lockutils [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 716.333160] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 716.333532] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 716.333759] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 716.990494] nova-conductor[52435]: ERROR nova.scheduler.utils [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be 
tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File 
"/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 879190bd-d68a-455e-abd2-73e8f85c3e28, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 29c85bbf-553e-4b82-ad7c-5341ffc5af63 was re-scheduled: Binding failed for port 879190bd-d68a-455e-abd2-73e8f85c3e28, please check neutron logs for more information.\n'] [ 716.991192] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be 
tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Rescheduling: True {{(pid=52435) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 716.991437] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 29c85bbf-553e-4b82-ad7c-5341ffc5af63.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 29c85bbf-553e-4b82-ad7c-5341ffc5af63. [ 716.991747] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 29c85bbf-553e-4b82-ad7c-5341ffc5af63. [ 717.043925] nova-conductor[52435]: DEBUG nova.network.neutron [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] deallocate_for_instance() {{(pid=52435) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 717.160409] nova-conductor[52435]: DEBUG nova.network.neutron [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] Instance cache missing network info. 
{{(pid=52435) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 717.164731] nova-conductor[52435]: DEBUG nova.network.neutron [None req-5982f57f-94e2-4ede-9a1d-e83cf71740be tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 29c85bbf-553e-4b82-ad7c-5341ffc5af63] Updating instance_info_cache with network_info: [] {{(pid=52435) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 717.174016] nova-conductor[52436]: ERROR nova.scheduler.utils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in 
wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance b7ab8792-137d-4053-9df9-3d560aa5e411 was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 717.175437] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Rescheduling: True {{(pid=52436) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 717.175679] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance b7ab8792-137d-4053-9df9-3d560aa5e411.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. 
Exhausted all hosts available for retrying build failures for instance b7ab8792-137d-4053-9df9-3d560aa5e411. [ 717.175990] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-7433848e-4986-4c55-8422-99a785e77d8e tempest-ServerShowV254Test-1863856496 tempest-ServerShowV254Test-1863856496-project-member] [instance: b7ab8792-137d-4053-9df9-3d560aa5e411] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance b7ab8792-137d-4053-9df9-3d560aa5e411. [ 717.609832] nova-conductor[52436]: ERROR nova.scheduler.utils [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File 
"/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 301ba51c-e66d-4f6a-b589-40341a138ddf, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception 
occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8 was re-scheduled: Binding failed for port 301ba51c-e66d-4f6a-b589-40341a138ddf, please check neutron logs for more information.\n'] [ 717.610395] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Rescheduling: True {{(pid=52436) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 717.610645] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8. [ 717.610910] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8. 
[ 717.636089] nova-conductor[52436]: DEBUG nova.network.neutron [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] deallocate_for_instance() {{(pid=52436) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 717.820858] nova-conductor[52436]: DEBUG nova.network.neutron [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] Instance cache missing network info. {{(pid=52436) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 717.824993] nova-conductor[52436]: DEBUG nova.network.neutron [None req-165b5755-201c-4c41-b1f1-5918d30a4aa7 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 5ad1fabc-bae4-47cb-9b27-42c86c4b02e8] Updating instance_info_cache with network_info: [] {{(pid=52436) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 719.624334] nova-conductor[52435]: ERROR nova.scheduler.utils [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 
1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port ef85a30e-1521-4200-a4af-6db739713904, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 6576c530-0b88-453e-bace-70a4f1c76d3c was re-scheduled: Binding failed for port ef85a30e-1521-4200-a4af-6db739713904, please check neutron logs for more information.\n'] [ 719.625957] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Rescheduling: True {{(pid=52435) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 719.625957] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 6576c530-0b88-453e-bace-70a4f1c76d3c.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. 
Exhausted all hosts available for retrying build failures for instance 6576c530-0b88-453e-bace-70a4f1c76d3c. [ 719.625957] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 6576c530-0b88-453e-bace-70a4f1c76d3c. [ 719.653635] nova-conductor[52435]: DEBUG nova.network.neutron [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] deallocate_for_instance() {{(pid=52435) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 719.938235] nova-conductor[52435]: DEBUG nova.network.neutron [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] Instance cache missing network info. 
{{(pid=52435) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 719.949414] nova-conductor[52435]: DEBUG nova.network.neutron [None req-94b330c4-ecff-4ffd-bb83-bde2e7c2fd79 tempest-InstanceActionsNegativeTestJSON-638305230 tempest-InstanceActionsNegativeTestJSON-638305230-project-member] [instance: 6576c530-0b88-453e-bace-70a4f1c76d3c] Updating instance_info_cache with network_info: [] {{(pid=52435) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 722.097498] nova-conductor[52436]: ERROR nova.scheduler.utils [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return 
self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 8c4b7ea2-a0f8-4f3d-a4c9-38b3e19ff1f8, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File 
"/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance db69e0d0-724b-4a87-80f5-390cfc395ee9 was re-scheduled: Binding failed for port 8c4b7ea2-a0f8-4f3d-a4c9-38b3e19ff1f8, please check neutron logs for more information.\n'] [ 722.097498] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Rescheduling: True {{(pid=52436) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 722.097498] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance db69e0d0-724b-4a87-80f5-390cfc395ee9.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance db69e0d0-724b-4a87-80f5-390cfc395ee9. [ 722.097856] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance db69e0d0-724b-4a87-80f5-390cfc395ee9. 
[ 722.122166] nova-conductor[52436]: DEBUG nova.network.neutron [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] deallocate_for_instance() {{(pid=52436) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 722.180758] nova-conductor[52436]: DEBUG nova.network.neutron [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] Instance cache missing network info. {{(pid=52436) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 722.189260] nova-conductor[52436]: DEBUG nova.network.neutron [None req-789599d9-f56b-442a-8a42-a313d99e721a tempest-AttachInterfacesV270Test-1791144961 tempest-AttachInterfacesV270Test-1791144961-project-member] [instance: db69e0d0-724b-4a87-80f5-390cfc395ee9] Updating instance_info_cache with network_info: [] {{(pid=52436) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 723.047156] nova-conductor[52436]: ERROR nova.scheduler.utils [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 
1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port b475fa57-76c3-4f1c-a1bf-fd6c13cd4193, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 40115f76-28d8-4f39-9dca-59401f52f22f was re-scheduled: Binding failed for port b475fa57-76c3-4f1c-a1bf-fd6c13cd4193, please check neutron logs for more information.\n'] [ 723.049866] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Rescheduling: True {{(pid=52436) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 723.049866] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 40115f76-28d8-4f39-9dca-59401f52f22f.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. 
Exhausted all hosts available for retrying build failures for instance 40115f76-28d8-4f39-9dca-59401f52f22f. [ 723.049866] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 40115f76-28d8-4f39-9dca-59401f52f22f. [ 723.075950] nova-conductor[52436]: DEBUG nova.network.neutron [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] deallocate_for_instance() {{(pid=52436) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 723.157546] nova-conductor[52436]: DEBUG nova.network.neutron [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] Instance cache missing network info. 
{{(pid=52436) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 723.161093] nova-conductor[52436]: DEBUG nova.network.neutron [None req-351a22bd-9494-43d8-8f43-a244c82dba0a tempest-DeleteServersAdminTestJSON-14307472 tempest-DeleteServersAdminTestJSON-14307472-project-member] [instance: 40115f76-28d8-4f39-9dca-59401f52f22f] Updating instance_info_cache with network_info: [] {{(pid=52436) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 723.939686] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Took 0.15 seconds to select destinations for 1 instance(s). {{(pid=52436) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 723.956804] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 723.957244] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 723.957566] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock 
"b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 723.987744] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 723.988352] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 723.988352] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 723.988522] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 723.988761] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 723.989492] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 723.997946] nova-conductor[52436]: DEBUG nova.quota [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Getting quotas for project d742fb05f93f44a9b9c8207f47e77730. Resources: {'instances', 'ram', 'cores'} {{(pid=52436) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 724.000371] nova-conductor[52436]: DEBUG nova.quota [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Getting quotas for user 9ae0c3fdf5814c20819e4329e87733e3 and project d742fb05f93f44a9b9c8207f47e77730. 
Resources: {'instances', 'ram', 'cores'} {{(pid=52436) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 724.008235] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52436) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 724.008235] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 724.008235] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 724.008235] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 724.009780] nova-conductor[52436]: DEBUG nova.conductor.manager [None 
req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 724.010467] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 724.010666] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 724.011094] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by 
"nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 724.027492] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 724.027736] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 724.028113] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 724.455901] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Took 0.17 seconds to select destinations for 1 instance(s). {{(pid=52435) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}}
[ 724.471018] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 724.471239] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 724.471409] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 724.537919] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 724.538158] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 724.538330] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 724.538926] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 724.538926] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 724.539059] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 724.549415] nova-conductor[52435]: DEBUG nova.quota [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Getting quotas for project a9e33b2e4b8c439a8e8a557ddda22fce. Resources: {'instances', 'ram', 'cores'} {{(pid=52435) _get_quotas /opt/stack/nova/nova/quota.py:393}}
[ 724.551898] nova-conductor[52435]: DEBUG nova.quota [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Getting quotas for user baef766334764dd9ab481d3a2aacd07b and project a9e33b2e4b8c439a8e8a557ddda22fce. Resources: {'instances', 'ram', 'cores'} {{(pid=52435) _get_quotas /opt/stack/nova/nova/quota.py:383}}
[ 724.558586] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52435) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}}
[ 724.559057] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 724.559265] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 724.559431] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 724.562114] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] block_device_mapping [BlockDeviceMapping(attachment_id=<?>,boot_index=0,connection_info=None,created_at=<?>,delete_on_termination=True,deleted=<?>,deleted_at=<?>,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=<?>,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=<?>,instance_uuid=<?>,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=<?>,uuid=<?>,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 724.562754] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 724.562955] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 724.563131] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 724.576023] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 724.576237] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 724.576411] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 725.989075] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Took 0.14 seconds to select destinations for 1 instance(s). {{(pid=52435) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}}
[ 726.001363] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 726.001627] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 726.001801] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 726.061992] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 726.062250] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 726.062423] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 726.062776] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 726.062989] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 726.063169] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 726.072144] nova-conductor[52435]: DEBUG nova.quota [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Getting quotas for project 02855237761b401fbd41810098ed77e9. Resources: {'instances', 'ram', 'cores'} {{(pid=52435) _get_quotas /opt/stack/nova/nova/quota.py:393}}
[ 726.074487] nova-conductor[52435]: DEBUG nova.quota [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Getting quotas for user fc2bbf8be07f4b8699a1c2b2813372bb and project 02855237761b401fbd41810098ed77e9. Resources: {'instances', 'ram', 'cores'} {{(pid=52435) _get_quotas /opt/stack/nova/nova/quota.py:383}}
[ 726.080928] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52435) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}}
[ 726.081539] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 726.081707] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 726.081850] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 726.086941] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] block_device_mapping [BlockDeviceMapping(attachment_id=<?>,boot_index=0,connection_info=None,created_at=<?>,delete_on_termination=True,deleted=<?>,deleted_at=<?>,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=<?>,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=<?>,instance_uuid=<?>,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=<?>,uuid=<?>,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 726.087602] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 726.087828] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 726.088012] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 726.101304] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 726.101512] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 726.101679] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 727.119030] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Took 0.18 seconds to select destinations for 1 instance(s). {{(pid=52435) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}}
[ 727.132431] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 727.132883] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 727.133213] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 727.197299] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 727.197299] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 727.197299] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 727.197299] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 727.197299] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 727.197299] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 727.206131] nova-conductor[52435]: DEBUG nova.quota [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Getting quotas for project 8ff7f4f6340440dfbe17b4c3b7a33c1d. Resources: {'instances', 'ram', 'cores'} {{(pid=52435) _get_quotas /opt/stack/nova/nova/quota.py:393}}
[ 727.208909] nova-conductor[52435]: DEBUG nova.quota [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Getting quotas for user 6f3bb547edf647c1a12433acc70091dc and project 8ff7f4f6340440dfbe17b4c3b7a33c1d. Resources: {'instances', 'ram', 'cores'} {{(pid=52435) _get_quotas /opt/stack/nova/nova/quota.py:383}}
[ 727.215813] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52435) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}}
[ 727.216370] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 727.216603] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 727.216834] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 727.221663] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] block_device_mapping [BlockDeviceMapping(attachment_id=<?>,boot_index=0,connection_info=None,created_at=<?>,delete_on_termination=True,deleted=<?>,deleted_at=<?>,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=<?>,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=<?>,instance_uuid=<?>,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=<?>,uuid=<?>,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 727.222761] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 727.223118] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 727.223272] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 727.241973] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 727.242692] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 727.244239] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 728.937803] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Took 0.14 seconds to select destinations for 1 instance(s). {{(pid=52435) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}}
[ 728.952668] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 728.952668] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 728.952952] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 729.008740] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 729.008740] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 729.008740] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 729.008740] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 729.008740] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 729.008740] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 729.019500] nova-conductor[52435]: DEBUG nova.quota [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Getting quotas for project 38f2b7c76cb04e49b1b8ac75980011b2. Resources: {'instances', 'ram', 'cores'} {{(pid=52435) _get_quotas /opt/stack/nova/nova/quota.py:393}}
[ 729.022182] nova-conductor[52435]: DEBUG nova.quota [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Getting quotas for user 9a2e115241bf4f4491e4736c14c8c75f and project 38f2b7c76cb04e49b1b8ac75980011b2. Resources: {'instances', 'ram', 'cores'} {{(pid=52435) _get_quotas /opt/stack/nova/nova/quota.py:383}}
[ 729.031444] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 197488cc-ac6b-4561-8d57-f372c6493573] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52435) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}}
[ 729.031982] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 729.032245] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 729.032474] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 729.033325] nova-conductor[52436]: ERROR nova.scheduler.utils
[None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in 
_allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 78685b50-e23c-45ff-8b69-8ac77b285c14, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 0fc5de88-13a6-498a-848c-35beb772be65 was re-scheduled: Binding failed for port 78685b50-e23c-45ff-8b69-8ac77b285c14, please check neutron logs for more information.\n'] [ 729.034616] nova-conductor[52436]: DEBUG nova.conductor.manager [None 
req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Rescheduling: True {{(pid=52436) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 729.034616] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 0fc5de88-13a6-498a-848c-35beb772be65.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 0fc5de88-13a6-498a-848c-35beb772be65. [ 729.034616] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 0fc5de88-13a6-498a-848c-35beb772be65. 
[ 729.039842] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 197488cc-ac6b-4561-8d57-f372c6493573] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 729.040802] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 729.041057] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 729.041258] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 
tempest-AttachInterfacesTestJSON-287887941-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 729.060407] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 729.060696] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 729.060796] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 729.063274] nova-conductor[52436]: DEBUG nova.network.neutron [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] deallocate_for_instance() {{(pid=52436) 
deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 729.133752] nova-conductor[52436]: DEBUG nova.network.neutron [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Instance cache missing network info. {{(pid=52436) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 729.141997] nova-conductor[52436]: DEBUG nova.network.neutron [None req-c1d11cad-307c-40df-932b-ba04eaee1bd5 tempest-ServersNegativeTestJSON-1843437481 tempest-ServersNegativeTestJSON-1843437481-project-member] [instance: 0fc5de88-13a6-498a-848c-35beb772be65] Updating instance_info_cache with network_info: [] {{(pid=52436) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 729.297174] nova-conductor[52435]: ERROR nova.scheduler.utils [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return 
self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise 
exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 2cc98968-ca9f-4493-94eb-91a79482e1f8, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance c573864b-774a-4e4d-be80-5bc9bbd1659d was re-scheduled: Binding failed for port 2cc98968-ca9f-4493-94eb-91a79482e1f8, please check neutron logs for more information.\n'] [ 729.297917] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Rescheduling: True {{(pid=52435) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 729.298166] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance c573864b-774a-4e4d-be80-5bc9bbd1659d.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance c573864b-774a-4e4d-be80-5bc9bbd1659d. 
[ 729.298920] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance c573864b-774a-4e4d-be80-5bc9bbd1659d. [ 729.331721] nova-conductor[52435]: DEBUG nova.network.neutron [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] deallocate_for_instance() {{(pid=52435) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 729.391311] nova-conductor[52435]: DEBUG nova.network.neutron [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] Instance cache missing network info. 
{{(pid=52435) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 729.395705] nova-conductor[52435]: DEBUG nova.network.neutron [None req-e3feb433-423a-40b9-872b-b16336325131 tempest-ServersTestManualDisk-1748827388 tempest-ServersTestManualDisk-1748827388-project-member] [instance: c573864b-774a-4e4d-be80-5bc9bbd1659d] Updating instance_info_cache with network_info: [] {{(pid=52435) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 731.394448] nova-conductor[52436]: ERROR nova.scheduler.utils [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return 
self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port cea10054-d802-4762-920a-926732dcdf98, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File 
"/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 72b94cae-d12d-4228-8ca0-20fde3095c38 was re-scheduled: Binding failed for port cea10054-d802-4762-920a-926732dcdf98, please check neutron logs for more information.\n'] [ 731.395831] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Rescheduling: True {{(pid=52436) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 731.396200] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 72b94cae-d12d-4228-8ca0-20fde3095c38.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 72b94cae-d12d-4228-8ca0-20fde3095c38. [ 731.396671] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 72b94cae-d12d-4228-8ca0-20fde3095c38. 
[ 731.420364] nova-conductor[52436]: DEBUG nova.network.neutron [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] deallocate_for_instance() {{(pid=52436) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 731.482107] nova-conductor[52436]: DEBUG nova.network.neutron [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] Instance cache missing network info. {{(pid=52436) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 731.490262] nova-conductor[52436]: DEBUG nova.network.neutron [None req-1ce0733e-98f4-4e73-9f1a-6e131beaf4ff tempest-ServersNegativeTestMultiTenantJSON-1136815640 tempest-ServersNegativeTestMultiTenantJSON-1136815640-project-member] [instance: 72b94cae-d12d-4228-8ca0-20fde3095c38] Updating instance_info_cache with network_info: [] {{(pid=52436) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 736.411173] nova-conductor[52435]: ERROR nova.scheduler.utils [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = 
self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File 
"/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 2af06c29-ed0d-4674-acb8-1a3778b201f6, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 8d391d2c-ea85-47d4-a140-03ea6da1c101 was re-scheduled: Binding failed for port 2af06c29-ed0d-4674-acb8-1a3778b201f6, please check neutron logs for more information.\n'] [ 736.412018] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Rescheduling: True {{(pid=52435) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 736.412605] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 8d391d2c-ea85-47d4-a140-03ea6da1c101.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. 
Exhausted all hosts available for retrying build failures for instance 8d391d2c-ea85-47d4-a140-03ea6da1c101. [ 736.412605] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 8d391d2c-ea85-47d4-a140-03ea6da1c101. [ 736.452669] nova-conductor[52435]: DEBUG nova.network.neutron [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] deallocate_for_instance() {{(pid=52435) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 736.527222] nova-conductor[52435]: DEBUG nova.network.neutron [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] Instance cache missing network info. 
{{(pid=52435) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 736.530331] nova-conductor[52435]: DEBUG nova.network.neutron [None req-a7c0fadb-3571-4334-bb7d-e78c411a7862 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: 8d391d2c-ea85-47d4-a140-03ea6da1c101] Updating instance_info_cache with network_info: [] {{(pid=52435) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 736.654509] nova-conductor[52436]: ERROR nova.scheduler.utils [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File 
"/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 5cb065d7-5d51-4bed-96ff-fecc4da6167d, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in 
_do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 4e0befc8-76e7-484d-957e-55b0aaedc2c6 was re-scheduled: Binding failed for port 5cb065d7-5d51-4bed-96ff-fecc4da6167d, please check neutron logs for more information.\n'] [ 736.655149] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Rescheduling: True {{(pid=52436) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 736.655371] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 4e0befc8-76e7-484d-957e-55b0aaedc2c6.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 4e0befc8-76e7-484d-957e-55b0aaedc2c6. [ 736.655590] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 4e0befc8-76e7-484d-957e-55b0aaedc2c6. 
[ 736.682231] nova-conductor[52436]: DEBUG nova.network.neutron [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] deallocate_for_instance() {{(pid=52436) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 736.746700] nova-conductor[52436]: DEBUG nova.network.neutron [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Instance cache missing network info. {{(pid=52436) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 736.752460] nova-conductor[52436]: DEBUG nova.network.neutron [None req-f3fae9dc-6421-49a5-8a23-c9e688e54ee2 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 4e0befc8-76e7-484d-957e-55b0aaedc2c6] Updating instance_info_cache with network_info: [] {{(pid=52436) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 739.111481] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Took 0.16 seconds to select destinations for 1 instance(s). 
{{(pid=52435) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 739.124705] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 739.124930] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 739.125110] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 739.147976] nova-conductor[52436]: ERROR nova.scheduler.utils [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 197488cc-ac6b-4561-8d57-f372c6493573] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, 
image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' 
File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 573439c9-df7f-4b26-8c67-091dcc6e41d9, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 197488cc-ac6b-4561-8d57-f372c6493573 was re-scheduled: Binding failed for port 573439c9-df7f-4b26-8c67-091dcc6e41d9, please check neutron logs for more information.\n'] [ 739.148565] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Rescheduling: True {{(pid=52436) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 739.148799] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] Failed to compute_task_build_instances: 
Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 197488cc-ac6b-4561-8d57-f372c6493573.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 197488cc-ac6b-4561-8d57-f372c6493573. [ 739.149015] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 197488cc-ac6b-4561-8d57-f372c6493573] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 197488cc-ac6b-4561-8d57-f372c6493573. [ 739.170113] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 739.170113] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 739.170113] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Lock "00000000-0000-0000-0000-000000000000" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 739.170113] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 739.170300] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 739.170300] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 739.173455] nova-conductor[52436]: DEBUG nova.network.neutron [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 197488cc-ac6b-4561-8d57-f372c6493573] deallocate_for_instance() {{(pid=52436) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 739.182147] 
nova-conductor[52435]: DEBUG nova.quota [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Getting quotas for project 51cca17e80d947b495de9f644e67bb98. Resources: {'instances', 'ram', 'cores'} {{(pid=52435) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 739.184797] nova-conductor[52435]: DEBUG nova.quota [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Getting quotas for user d07a8b6181ab41348233feb85133e0a4 and project 51cca17e80d947b495de9f644e67bb98. Resources: {'instances', 'ram', 'cores'} {{(pid=52435) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 739.190398] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52435) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 739.190805] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 739.191019] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 739.191190] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 739.194886] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 739.195306] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 739.195506] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 739.195672] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 739.212810] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 739.213055] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 739.213225] nova-conductor[52435]: DEBUG 
oslo_concurrency.lockutils [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 739.219993] nova-conductor[52436]: DEBUG nova.network.neutron [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 197488cc-ac6b-4561-8d57-f372c6493573] Instance cache missing network info. {{(pid=52436) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 739.228926] nova-conductor[52436]: DEBUG nova.network.neutron [None req-b35cc747-764c-4412-8f6e-d8b750bccc22 tempest-AttachInterfacesTestJSON-287887941 tempest-AttachInterfacesTestJSON-287887941-project-member] [instance: 197488cc-ac6b-4561-8d57-f372c6493573] Updating instance_info_cache with network_info: [] {{(pid=52436) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 740.997041] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Took 0.14 seconds to select destinations for 1 instance(s). 
{{(pid=52436) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 741.008316] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 741.008686] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 741.008987] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 741.038866] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 741.039299] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None 
req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 741.039761] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 741.040226] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 741.040506] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 741.040834] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Lock 
"b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 741.051299] nova-conductor[52436]: DEBUG nova.quota [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Getting quotas for project 6dda5a654e88441fa1c1f01d1435fda8. Resources: {'instances', 'ram', 'cores'} {{(pid=52436) _get_quotas /opt/stack/nova/nova/quota.py:393}}
[ 741.053795] nova-conductor[52436]: DEBUG nova.quota [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Getting quotas for user dd08df6035734ed594e1a61adc83f5a1 and project 6dda5a654e88441fa1c1f01d1435fda8. Resources: {'instances', 'ram', 'cores'} {{(pid=52436) _get_quotas /opt/stack/nova/nova/quota.py:383}}
[ 741.066791] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52436) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}}
[ 741.066791] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 741.066791] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 741.066791] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 741.069676] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] [instance: 963513ec-2280-475b-87a0-045df892e8b4] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 741.070529] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 741.070852] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 741.071166] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 741.087928] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 741.088182] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 741.088352] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 741.972578] nova-conductor[52436]: ERROR nova.scheduler.utils [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 33db9413-9863-4b89-8ca0-4838adac1c47, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 949f98a2-9316-4cbd-b1e3-b05d08a68997 was re-scheduled: Binding failed for port 33db9413-9863-4b89-8ca0-4838adac1c47, please check neutron logs for more information.\n']
[ 741.973508] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Rescheduling: True {{(pid=52436) build_instances /opt/stack/nova/nova/conductor/manager.py:695}}
[ 741.976370] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 949f98a2-9316-4cbd-b1e3-b05d08a68997.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 949f98a2-9316-4cbd-b1e3-b05d08a68997.
[ 741.976370] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 949f98a2-9316-4cbd-b1e3-b05d08a68997.
[ 742.004913] nova-conductor[52436]: DEBUG nova.network.neutron [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] deallocate_for_instance() {{(pid=52436) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 742.092169] nova-conductor[52436]: DEBUG nova.network.neutron [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Instance cache missing network info. {{(pid=52436) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 742.099397] nova-conductor[52436]: DEBUG nova.network.neutron [None req-7ca2c827-b9a2-49cc-a651-983ad4ad11f6 tempest-ServerRescueTestJSON-1441311787 tempest-ServerRescueTestJSON-1441311787-project-member] [instance: 949f98a2-9316-4cbd-b1e3-b05d08a68997] Updating instance_info_cache with network_info: [] {{(pid=52436) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 743.234040] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Took 0.13 seconds to select destinations for 1 instance(s). {{(pid=52436) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}}
[ 743.252659] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 743.252935] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 743.253116] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 743.253764] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Took 0.15 seconds to select destinations for 1 instance(s). {{(pid=52435) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}}
[ 743.285550] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 743.285708] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 743.286012] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 743.303032] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 743.303032] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 743.303032] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 743.303321] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 743.303460] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 743.304427] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 743.313196] nova-conductor[52436]: DEBUG nova.quota [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Getting quotas for project a9e33b2e4b8c439a8e8a557ddda22fce. Resources: {'instances', 'ram', 'cores'} {{(pid=52436) _get_quotas /opt/stack/nova/nova/quota.py:393}}
[ 743.316183] nova-conductor[52436]: DEBUG nova.quota [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Getting quotas for user baef766334764dd9ab481d3a2aacd07b and project a9e33b2e4b8c439a8e8a557ddda22fce. Resources: {'instances', 'ram', 'cores'} {{(pid=52436) _get_quotas /opt/stack/nova/nova/quota.py:383}}
[ 743.317560] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 743.317848] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 743.318077] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 743.318494] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 743.318716] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 743.318903] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 743.325097] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52436) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}}
[ 743.325872] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 743.325979] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 743.327080] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 743.334315] nova-conductor[52435]: DEBUG nova.quota [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Getting quotas for project d742fb05f93f44a9b9c8207f47e77730. Resources: {'instances', 'ram', 'cores'} {{(pid=52435) _get_quotas /opt/stack/nova/nova/quota.py:393}}
[ 743.337797] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 743.337933] nova-conductor[52435]: DEBUG nova.quota [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Getting quotas for user 9ae0c3fdf5814c20819e4329e87733e3 and project d742fb05f93f44a9b9c8207f47e77730. Resources: {'instances', 'ram', 'cores'} {{(pid=52435) _get_quotas /opt/stack/nova/nova/quota.py:383}}
[ 743.337985] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 743.337985] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 743.337985] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 743.343570] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52435) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}}
[ 743.344835] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 743.345081] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 743.345263] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 743.349438] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: bb9c495d-4f2f-4d2a-9af9-4d0ae62fe88d] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 743.349575] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 743.349676] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 743.349826] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 743.353117] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 743.353836] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 743.353836] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 743.369441] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 743.369441] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 743.369441] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-330226e4-1da1-4ca3-b279-713522d5f42c tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 749.619731] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Took 0.12 seconds to select destinations for 1 instance(s). {{(pid=52436) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}}
[ 749.632329] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 749.632573] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 749.632864] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 749.678137] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 749.678137] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 749.678323] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 749.678689] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 749.678916] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 749.679161] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 749.688904] nova-conductor[52436]: DEBUG nova.quota [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Getting quotas for project d742fb05f93f44a9b9c8207f47e77730. Resources: {'instances', 'ram', 'cores'} {{(pid=52436) _get_quotas /opt/stack/nova/nova/quota.py:393}}
[ 749.691674] nova-conductor[52436]: DEBUG nova.quota [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Getting quotas for user 9ae0c3fdf5814c20819e4329e87733e3 and project d742fb05f93f44a9b9c8207f47e77730. Resources: {'instances', 'ram', 'cores'} {{(pid=52436) _get_quotas /opt/stack/nova/nova/quota.py:383}}
[ 749.698543] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52436) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}}
[ 749.699243] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 749.700224] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None
req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 749.700224] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 749.707804] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 749.707804] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock 
"b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 749.707804] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 749.707804] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 749.720754] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 749.722415] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 749.722415] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 751.340125] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Took 0.14 seconds to select destinations for 1 instance(s). {{(pid=52435) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 751.351790] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 751.352091] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 751.352451] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 
tempest-ServersAdminNegativeTestJSON-505936840-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 751.385689] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 751.385919] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 751.386103] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 751.386456] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 751.386667] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 751.386835] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 751.399103] nova-conductor[52435]: DEBUG nova.quota [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Getting quotas for project c884340032e64abcbc9e405b7da4cb6f. Resources: {'instances', 'ram', 'cores'} {{(pid=52435) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 751.402053] nova-conductor[52435]: DEBUG nova.quota [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Getting quotas for user 4c92e5f993844571be7b606f48976f9b and project c884340032e64abcbc9e405b7da4cb6f. 
Resources: {'instances', 'ram', 'cores'} {{(pid=52435) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 751.409840] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52435) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 751.410113] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 751.410270] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 751.411441] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 751.413478] 
nova-conductor[52435]: DEBUG nova.conductor.manager [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 751.414167] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 751.414408] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 751.414651] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 
tempest-ServersAdminNegativeTestJSON-505936840-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 751.427645] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 751.428319] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 751.428319] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 751.689467] nova-conductor[52435]: ERROR nova.scheduler.utils [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Error from last host: cpu-1 
(node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in 
allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 02752945-ef11-45a3-8c97-693f994af658, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance ac74db4e-ee8d-4aab-96bc-b41bc30d371b was re-scheduled: Binding failed for port 02752945-ef11-45a3-8c97-693f994af658, please check neutron logs for more information.\n'] [ 751.689720] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Rescheduling: True {{(pid=52435) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 751.689954] 
nova-conductor[52435]: WARNING nova.scheduler.utils [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance ac74db4e-ee8d-4aab-96bc-b41bc30d371b.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance ac74db4e-ee8d-4aab-96bc-b41bc30d371b. [ 751.690274] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance ac74db4e-ee8d-4aab-96bc-b41bc30d371b. [ 751.722473] nova-conductor[52435]: DEBUG nova.network.neutron [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] deallocate_for_instance() {{(pid=52435) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 752.009046] nova-conductor[52435]: DEBUG nova.network.neutron [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Instance cache missing network info. 
{{(pid=52435) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 752.022319] nova-conductor[52435]: DEBUG nova.network.neutron [None req-0eeb4a79-86a1-491d-8c66-e215cb3f9c23 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] [instance: ac74db4e-ee8d-4aab-96bc-b41bc30d371b] Updating instance_info_cache with network_info: [] {{(pid=52435) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 752.543527] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Took 0.16 seconds to select destinations for 1 instance(s). {{(pid=52436) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 752.558418] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 752.558900] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 752.559095] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Lock 
"b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 752.600739] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 752.600972] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 752.601195] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 752.601667] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 752.601728] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 752.601873] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 752.611876] nova-conductor[52436]: DEBUG nova.quota [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Getting quotas for project 27d36caa344f42aca3919c92d468bbd6. Resources: {'instances', 'ram', 'cores'} {{(pid=52436) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 752.615128] nova-conductor[52436]: DEBUG nova.quota [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Getting quotas for user 18ef365c1e9342b7b0354ea96850399f and project 27d36caa344f42aca3919c92d468bbd6. 
Resources: {'instances', 'ram', 'cores'} {{(pid=52436) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 752.625894] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52436) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 752.625894] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 752.625894] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 752.626206] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 752.630382] nova-conductor[52436]: DEBUG nova.conductor.manager [None 
req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 752.631025] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 752.631227] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 752.631387] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 752.645879] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 752.646109] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 752.646279] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 753.206021] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Took 0.15 seconds to select destinations for 1 instance(s). 
{{(pid=52436) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 753.226975] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 753.226975] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 753.226975] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 753.282425] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 753.282630] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f 
tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 753.282796] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 753.283159] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 753.283340] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 753.283496] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 753.294739] nova-conductor[52436]: DEBUG nova.quota [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Getting quotas for project bb0823f51ed044b7ad68386fb1f60fb5. Resources: {'instances', 'ram', 'cores'} {{(pid=52436) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 753.300706] nova-conductor[52436]: DEBUG nova.quota [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Getting quotas for user 52b81f22ae004f55a58accddbf06b161 and project bb0823f51ed044b7ad68386fb1f60fb5. Resources: {'instances', 'ram', 'cores'} {{(pid=52436) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 753.308874] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52436) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 753.309509] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 753.309845] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 
tempest-ServerActionsTestJSON-260710802-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 753.310053] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 753.313541] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 753.314948] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" 
{{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 753.315180] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 753.315351] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 753.335565] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 753.335801] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 753.335973] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None 
req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 754.032821] nova-conductor[52436]: ERROR nova.scheduler.utils [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File 
"/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 64249b93-1271-4bd9-b7ca-deae2d68e0fd, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File 
"/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 963513ec-2280-475b-87a0-045df892e8b4 was re-scheduled: Binding failed for port 64249b93-1271-4bd9-b7ca-deae2d68e0fd, please check neutron logs for more information.\n'] [ 754.033875] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Rescheduling: True {{(pid=52436) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 754.033945] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 963513ec-2280-475b-87a0-045df892e8b4.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 963513ec-2280-475b-87a0-045df892e8b4. [ 754.034271] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 963513ec-2280-475b-87a0-045df892e8b4. 
[ 754.070753] nova-conductor[52436]: DEBUG nova.network.neutron [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] [instance: 963513ec-2280-475b-87a0-045df892e8b4] deallocate_for_instance() {{(pid=52436) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 754.225177] nova-conductor[52436]: DEBUG nova.network.neutron [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Instance cache missing network info. {{(pid=52436) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 754.233295] nova-conductor[52436]: DEBUG nova.network.neutron [None req-1bee8ed7-c957-4e88-9a2a-796e47b7abe1 tempest-ServerPasswordTestJSON-1808313117 tempest-ServerPasswordTestJSON-1808313117-project-member] [instance: 963513ec-2280-475b-87a0-045df892e8b4] Updating instance_info_cache with network_info: [] {{(pid=52436) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 756.900487] nova-conductor[52436]: ERROR nova.scheduler.utils [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 
1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 67a8229f-3cad-45fd-8c40-d2c3e22b636a, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance f12524a7-21b9-4e35-b15b-955627d58c7a was re-scheduled: Binding failed for port 67a8229f-3cad-45fd-8c40-d2c3e22b636a, please check neutron logs for more information.\n'] [ 756.900487] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Rescheduling: True {{(pid=52436) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 756.900487] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance f12524a7-21b9-4e35-b15b-955627d58c7a.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. 
Exhausted all hosts available for retrying build failures for instance f12524a7-21b9-4e35-b15b-955627d58c7a. [ 756.900487] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance f12524a7-21b9-4e35-b15b-955627d58c7a. [ 756.924723] nova-conductor[52436]: DEBUG nova.network.neutron [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] deallocate_for_instance() {{(pid=52436) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 757.006346] nova-conductor[52436]: DEBUG nova.network.neutron [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Instance cache missing network info. 
{{(pid=52436) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 757.011709] nova-conductor[52436]: DEBUG nova.network.neutron [None req-f5dc14c3-7cfa-42da-b2ca-ce318c544e89 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: f12524a7-21b9-4e35-b15b-955627d58c7a] Updating instance_info_cache with network_info: [] {{(pid=52436) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 759.351409] nova-conductor[52435]: ERROR nova.conductor.manager [None req-ad25ccfb-c153-48de-9a11-cfc40c830b56 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 759.351409] nova-conductor[52435]: Traceback (most recent call last): [ 759.351409] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 759.351409] nova-conductor[52435]: return func(*args, **kwargs) [ 759.351409] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 759.351409] nova-conductor[52435]: selections = self._select_destinations( [ 759.351409] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 759.351409] nova-conductor[52435]: selections = self._schedule( [ 759.351409] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 759.351409] nova-conductor[52435]: self._ensure_sufficient_hosts( [ 759.351409] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 759.351409] nova-conductor[52435]: raise exception.NoValidHost(reason=reason) [ 759.351409] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. 
There are not enough hosts available.
[ 759.351409] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 759.351409] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 759.351409] nova-conductor[52435]: ERROR nova.conductor.manager     host_lists = self._schedule_instances(context, request_specs[0],
[ 759.351409] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 759.351409] nova-conductor[52435]: ERROR nova.conductor.manager     host_lists = self.query_client.select_destinations(
[ 759.351409] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 759.351409] nova-conductor[52435]: ERROR nova.conductor.manager     return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 759.351409] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 759.351409] nova-conductor[52435]: ERROR nova.conductor.manager     return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 759.351409] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 759.351409] nova-conductor[52435]: ERROR nova.conductor.manager     result = self.transport._send(
[ 759.351409] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 759.351409] nova-conductor[52435]: ERROR nova.conductor.manager     return self._driver.send(target, ctxt, message,
[ 759.351409] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 759.351409] nova-conductor[52435]: ERROR nova.conductor.manager     return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 759.351409] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 759.351409] nova-conductor[52435]: ERROR nova.conductor.manager     raise result
[ 759.351409] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 759.351409] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 759.351409] nova-conductor[52435]: ERROR nova.conductor.manager
[ 759.351409] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 759.351409] nova-conductor[52435]: ERROR nova.conductor.manager     return func(*args, **kwargs)
[ 759.351409] nova-conductor[52435]: ERROR nova.conductor.manager
[ 759.351409] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 759.351409] nova-conductor[52435]: ERROR nova.conductor.manager     selections = self._select_destinations(
[ 759.351409] nova-conductor[52435]: ERROR nova.conductor.manager
[ 759.351409] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 759.351409] nova-conductor[52435]: ERROR nova.conductor.manager     selections = self._schedule(
[ 759.351409] nova-conductor[52435]: ERROR nova.conductor.manager
[ 759.351409] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 759.351409] nova-conductor[52435]: ERROR nova.conductor.manager     self._ensure_sufficient_hosts(
[ 759.351409] nova-conductor[52435]: ERROR nova.conductor.manager
[ 759.351409] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 759.351409] nova-conductor[52435]: ERROR nova.conductor.manager     raise exception.NoValidHost(reason=reason)
[ 759.351409] nova-conductor[52435]: ERROR nova.conductor.manager
[ 759.351409] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 759.351409] nova-conductor[52435]: ERROR nova.conductor.manager
[ 759.351409] nova-conductor[52435]: ERROR nova.conductor.manager
[ 759.363675] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-ad25ccfb-c153-48de-9a11-cfc40c830b56 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 759.363908] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-ad25ccfb-c153-48de-9a11-cfc40c830b56 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 759.364152] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-ad25ccfb-c153-48de-9a11-cfc40c830b56 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 759.428708] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-ad25ccfb-c153-48de-9a11-cfc40c830b56 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: acfe2649-fd7c-41e0-9cb1-8e139fb9e6bd] block_device_mapping [BlockDeviceMapping(attachment_id=<?>,boot_index=0,connection_info=None,created_at=<?>,delete_on_termination=True,deleted=<?>,deleted_at=<?>,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=<?>,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=<?>,instance_uuid=<?>,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=<?>,uuid=<?>,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 759.429872] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-ad25ccfb-c153-48de-9a11-cfc40c830b56 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 759.429872] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-ad25ccfb-c153-48de-9a11-cfc40c830b56 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 759.429986] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-ad25ccfb-c153-48de-9a11-cfc40c830b56 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 759.433374] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-ad25ccfb-c153-48de-9a11-cfc40c830b56 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 759.433374] nova-conductor[52435]: Traceback (most recent call last):
[ 759.433374] nova-conductor[52435]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 759.433374] nova-conductor[52435]:     return func(*args, **kwargs)
[ 759.433374] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 759.433374] nova-conductor[52435]:     selections = self._select_destinations(
[ 759.433374] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 759.433374] nova-conductor[52435]:     selections = self._schedule(
[ 759.433374] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 759.433374] nova-conductor[52435]:     self._ensure_sufficient_hosts(
[ 759.433374] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 759.433374] nova-conductor[52435]:     raise exception.NoValidHost(reason=reason)
[ 759.433374] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 759.433374] nova-conductor[52435]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 759.437997] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-ad25ccfb-c153-48de-9a11-cfc40c830b56 tempest-ServerDiskConfigTestJSON-927379070 tempest-ServerDiskConfigTestJSON-927379070-project-member] [instance: acfe2649-fd7c-41e0-9cb1-8e139fb9e6bd] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 760.486529] nova-conductor[52435]: ERROR nova.conductor.manager [None req-7bca7402-5bc7-4f86-b128-23509c34a6c3 tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 760.486529] nova-conductor[52435]: Traceback (most recent call last):
[ 760.486529] nova-conductor[52435]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 760.486529] nova-conductor[52435]:     return func(*args, **kwargs)
[ 760.486529] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 760.486529] nova-conductor[52435]:     selections = self._select_destinations(
[ 760.486529] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 760.486529] nova-conductor[52435]:     selections = self._schedule(
[ 760.486529] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 760.486529] nova-conductor[52435]:     self._ensure_sufficient_hosts(
[ 760.486529] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 760.486529] nova-conductor[52435]:     raise exception.NoValidHost(reason=reason)
[ 760.486529] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 760.486529] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 760.486529] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 760.486529] nova-conductor[52435]: ERROR nova.conductor.manager     host_lists = self._schedule_instances(context, request_specs[0],
[ 760.486529] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 760.486529] nova-conductor[52435]: ERROR nova.conductor.manager     host_lists = self.query_client.select_destinations(
[ 760.486529] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 760.486529] nova-conductor[52435]: ERROR nova.conductor.manager     return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 760.486529] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 760.486529] nova-conductor[52435]: ERROR nova.conductor.manager     return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 760.486529] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 760.486529] nova-conductor[52435]: ERROR nova.conductor.manager     result = self.transport._send(
[ 760.486529] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 760.486529] nova-conductor[52435]: ERROR nova.conductor.manager     return self._driver.send(target, ctxt, message,
[ 760.486529] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 760.486529] nova-conductor[52435]: ERROR nova.conductor.manager     return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 760.486529] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 760.486529] nova-conductor[52435]: ERROR nova.conductor.manager     raise result
[ 760.486529] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 760.486529] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 760.486529] nova-conductor[52435]: ERROR nova.conductor.manager
[ 760.486529] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 760.486529] nova-conductor[52435]: ERROR nova.conductor.manager     return func(*args, **kwargs)
[ 760.486529] nova-conductor[52435]: ERROR nova.conductor.manager
[ 760.486529] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 760.486529] nova-conductor[52435]: ERROR nova.conductor.manager     selections = self._select_destinations(
[ 760.486529] nova-conductor[52435]: ERROR nova.conductor.manager
[ 760.486529] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 760.486529] nova-conductor[52435]: ERROR nova.conductor.manager     selections = self._schedule(
[ 760.486529] nova-conductor[52435]: ERROR nova.conductor.manager
[ 760.486529] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 760.486529] nova-conductor[52435]: ERROR nova.conductor.manager     self._ensure_sufficient_hosts(
[ 760.486529] nova-conductor[52435]: ERROR nova.conductor.manager
[ 760.486529] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 760.486529] nova-conductor[52435]: ERROR nova.conductor.manager     raise exception.NoValidHost(reason=reason)
[ 760.486529] nova-conductor[52435]: ERROR nova.conductor.manager
[ 760.486529] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 760.486529] nova-conductor[52435]: ERROR nova.conductor.manager
[ 760.486529] nova-conductor[52435]: ERROR nova.conductor.manager
[ 760.496818] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-7bca7402-5bc7-4f86-b128-23509c34a6c3 tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 760.497109] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-7bca7402-5bc7-4f86-b128-23509c34a6c3 tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 760.497301] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-7bca7402-5bc7-4f86-b128-23509c34a6c3 tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 760.561409] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-7bca7402-5bc7-4f86-b128-23509c34a6c3 tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] [instance: e668518b-9b9e-4fb4-8db2-3e0197b78fb0] block_device_mapping [BlockDeviceMapping(attachment_id=<?>,boot_index=0,connection_info=None,created_at=<?>,delete_on_termination=True,deleted=<?>,deleted_at=<?>,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=<?>,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=<?>,instance_uuid=<?>,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=<?>,uuid=<?>,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 760.561929] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-7bca7402-5bc7-4f86-b128-23509c34a6c3 tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 760.565142] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-7bca7402-5bc7-4f86-b128-23509c34a6c3 tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 760.565142] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-7bca7402-5bc7-4f86-b128-23509c34a6c3 tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 760.573375] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-7bca7402-5bc7-4f86-b128-23509c34a6c3 tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 760.573375] nova-conductor[52435]: Traceback (most recent call last):
[ 760.573375] nova-conductor[52435]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 760.573375] nova-conductor[52435]:     return func(*args, **kwargs)
[ 760.573375] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 760.573375] nova-conductor[52435]:     selections = self._select_destinations(
[ 760.573375] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 760.573375] nova-conductor[52435]:     selections = self._schedule(
[ 760.573375] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 760.573375] nova-conductor[52435]:     self._ensure_sufficient_hosts(
[ 760.573375] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 760.573375] nova-conductor[52435]:     raise exception.NoValidHost(reason=reason)
[ 760.573375] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 760.573375] nova-conductor[52435]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 760.573375] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-7bca7402-5bc7-4f86-b128-23509c34a6c3 tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] [instance: e668518b-9b9e-4fb4-8db2-3e0197b78fb0] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 760.824152] nova-conductor[52435]: ERROR nova.conductor.manager [None req-3877f559-7f41-4558-be4c-2a32fc6ba8ed tempest-ServersTestBootFromVolume-1821255316 tempest-ServersTestBootFromVolume-1821255316-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 760.824152] nova-conductor[52435]: Traceback (most recent call last):
[ 760.824152] nova-conductor[52435]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 760.824152] nova-conductor[52435]:     return func(*args, **kwargs)
[ 760.824152] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 760.824152] nova-conductor[52435]:     selections = self._select_destinations(
[ 760.824152] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 760.824152] nova-conductor[52435]:     selections = self._schedule(
[ 760.824152] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 760.824152] nova-conductor[52435]:     self._ensure_sufficient_hosts(
[ 760.824152] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 760.824152] nova-conductor[52435]:     raise exception.NoValidHost(reason=reason)
[ 760.824152] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 760.824152] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 760.824152] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 760.824152] nova-conductor[52435]: ERROR nova.conductor.manager     host_lists = self._schedule_instances(context, request_specs[0],
[ 760.824152] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 760.824152] nova-conductor[52435]: ERROR nova.conductor.manager     host_lists = self.query_client.select_destinations(
[ 760.824152] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 760.824152] nova-conductor[52435]: ERROR nova.conductor.manager     return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 760.824152] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 760.824152] nova-conductor[52435]: ERROR nova.conductor.manager     return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 760.824152] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 760.824152] nova-conductor[52435]: ERROR nova.conductor.manager     result = self.transport._send(
[ 760.824152] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 760.824152] nova-conductor[52435]: ERROR nova.conductor.manager     return self._driver.send(target, ctxt, message,
[ 760.824152] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 760.824152] nova-conductor[52435]: ERROR nova.conductor.manager     return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 760.824152] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 760.824152] nova-conductor[52435]: ERROR nova.conductor.manager     raise result
[ 760.824152] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 760.824152] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 760.824152] nova-conductor[52435]: ERROR nova.conductor.manager
[ 760.824152] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 760.824152] nova-conductor[52435]: ERROR nova.conductor.manager     return func(*args, **kwargs)
[ 760.824152] nova-conductor[52435]: ERROR nova.conductor.manager
[ 760.824152] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 760.824152] nova-conductor[52435]: ERROR nova.conductor.manager     selections = self._select_destinations(
[ 760.824152] nova-conductor[52435]: ERROR nova.conductor.manager
[ 760.824152] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 760.824152] nova-conductor[52435]: ERROR nova.conductor.manager     selections = self._schedule(
[ 760.824152] nova-conductor[52435]: ERROR nova.conductor.manager
[ 760.824152] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 760.824152] nova-conductor[52435]: ERROR nova.conductor.manager     self._ensure_sufficient_hosts(
[ 760.824152] nova-conductor[52435]: ERROR nova.conductor.manager
[ 760.824152] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 760.824152] nova-conductor[52435]: ERROR nova.conductor.manager     raise exception.NoValidHost(reason=reason)
[ 760.824152] nova-conductor[52435]: ERROR nova.conductor.manager
[ 760.824152] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 760.824152] nova-conductor[52435]: ERROR nova.conductor.manager
[ 760.824152] nova-conductor[52435]: ERROR nova.conductor.manager
[ 760.838634] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-3877f559-7f41-4558-be4c-2a32fc6ba8ed tempest-ServersTestBootFromVolume-1821255316 tempest-ServersTestBootFromVolume-1821255316-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 760.838634] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-3877f559-7f41-4558-be4c-2a32fc6ba8ed tempest-ServersTestBootFromVolume-1821255316 tempest-ServersTestBootFromVolume-1821255316-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 760.838634] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-3877f559-7f41-4558-be4c-2a32fc6ba8ed tempest-ServersTestBootFromVolume-1821255316 tempest-ServersTestBootFromVolume-1821255316-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 760.911764] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-3877f559-7f41-4558-be4c-2a32fc6ba8ed tempest-ServersTestBootFromVolume-1821255316 tempest-ServersTestBootFromVolume-1821255316-project-member] [instance: ce96198a-57c1-4423-8105-d5cdcd2de4cd] block_device_mapping [BlockDeviceMapping(attachment_id=47f331fa-e549-4cdf-8036-3830b97ae07f,boot_index=0,connection_info=None,created_at=<?>,delete_on_termination=True,deleted=<?>,deleted_at=<?>,destination_type='volume',device_name=None,device_type=None,disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=<?>,image_id=None,instance=<?>,instance_uuid=<?>,no_device=False,snapshot_id=None,source_type='volume',tag=None,updated_at=<?>,uuid=<?>,volume_id='36047ef0-9045-4b05-8950-3aa0736c6bc9',volume_size=1,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 760.911764] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-3877f559-7f41-4558-be4c-2a32fc6ba8ed tempest-ServersTestBootFromVolume-1821255316 tempest-ServersTestBootFromVolume-1821255316-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 760.911764] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-3877f559-7f41-4558-be4c-2a32fc6ba8ed tempest-ServersTestBootFromVolume-1821255316 tempest-ServersTestBootFromVolume-1821255316-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 760.911764] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-3877f559-7f41-4558-be4c-2a32fc6ba8ed tempest-ServersTestBootFromVolume-1821255316 tempest-ServersTestBootFromVolume-1821255316-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 760.913556] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-3877f559-7f41-4558-be4c-2a32fc6ba8ed tempest-ServersTestBootFromVolume-1821255316 tempest-ServersTestBootFromVolume-1821255316-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 760.913556] nova-conductor[52435]: Traceback (most recent call last):
[ 760.913556] nova-conductor[52435]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 760.913556] nova-conductor[52435]:     return func(*args, **kwargs)
[ 760.913556] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 760.913556] nova-conductor[52435]:     selections = self._select_destinations(
[ 760.913556] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 760.913556] nova-conductor[52435]:     selections = self._schedule(
[ 760.913556] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 760.913556] nova-conductor[52435]:     self._ensure_sufficient_hosts(
[ 760.913556] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 760.913556] nova-conductor[52435]:     raise exception.NoValidHost(reason=reason)
[ 760.913556] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 760.913556] nova-conductor[52435]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 760.914694] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-3877f559-7f41-4558-be4c-2a32fc6ba8ed tempest-ServersTestBootFromVolume-1821255316 tempest-ServersTestBootFromVolume-1821255316-project-member] [instance: ce96198a-57c1-4423-8105-d5cdcd2de4cd] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 761.321302] nova-conductor[52435]: ERROR nova.scheduler.utils [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', '  File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n    self.driver.spawn(context, instance, image_meta,\n', '  File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n    self._vmops.spawn(context, instance, image_meta, injected_files,\n', '  File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n    vm_ref = self.build_virtual_machine(instance,\n', '  File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n    vif_infos = vmwarevif.get_vif_info(self._session,\n', '  File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n    for vif in network_info:\n', '  File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n    return self._sync_wrapper(fn, *args, **kwargs)\n', '  File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n    self.wait()\n', '  File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n    self[:] = self._gt.wait()\n', '  File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n    return self._exit_event.wait()\n', '  File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n    result = hub.switch()\n', '  File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n    return self.greenlet.switch()\n', '  File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n    result = function(*args, **kwargs)\n', '  File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n    return func(*args, **kwargs)\n', '  File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n    raise e\n', '  File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n    nwinfo = self.network_api.allocate_for_instance(\n', '  File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n    created_port_ids = self._update_ports_for_instance(\n', '  File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n    with excutils.save_and_reraise_exception():\n', '  File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n    self.force_reraise()\n', '  File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n    raise self.value\n', '  File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n    updated_port = self._update_port(\n', '  File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n    _ensure_no_port_binding_failure(port)\n', '  File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n    raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 55fd5071-a52d-47ff-875a-8662e0df32fd, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', '  File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n    self._build_and_run_instance(context, instance, image,\n', '  File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n    raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 67b672b6-c6cb-4dc2-9d75-fb028195a0dd was re-scheduled: Binding failed for port 55fd5071-a52d-47ff-875a-8662e0df32fd, please check neutron logs for more information.\n']
[ 761.321849] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Rescheduling: True {{(pid=52435) build_instances /opt/stack/nova/nova/conductor/manager.py:695}}
[ 761.322804] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 67b672b6-c6cb-4dc2-9d75-fb028195a0dd.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 67b672b6-c6cb-4dc2-9d75-fb028195a0dd.
[ 761.322804] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 67b672b6-c6cb-4dc2-9d75-fb028195a0dd.
[ 761.351107] nova-conductor[52435]: DEBUG nova.network.neutron [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] deallocate_for_instance() {{(pid=52435) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 761.569255] nova-conductor[52435]: DEBUG nova.network.neutron [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] Instance cache missing network info. {{(pid=52435) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 761.572376] nova-conductor[52435]: DEBUG nova.network.neutron [None req-f2feffbd-4cf4-49c3-bb61-a8840deeb00f tempest-ServerActionsTestJSON-260710802 tempest-ServerActionsTestJSON-260710802-project-member] [instance: 67b672b6-c6cb-4dc2-9d75-fb028195a0dd] Updating instance_info_cache with network_info: [] {{(pid=52435) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 761.864867] nova-conductor[52435]: ERROR nova.conductor.manager [None req-c9f92269-dfe5-4076-ab5a-358a15c1fa39 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 761.864867] nova-conductor[52435]: Traceback (most recent call last):
[ 761.864867] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 761.864867] nova-conductor[52435]: return func(*args, **kwargs)
[ 761.864867] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 761.864867] nova-conductor[52435]: selections = self._select_destinations(
[ 761.864867] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 761.864867] nova-conductor[52435]: selections = self._schedule(
[ 761.864867] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 761.864867] nova-conductor[52435]: self._ensure_sufficient_hosts(
[ 761.864867] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 761.864867] nova-conductor[52435]: raise exception.NoValidHost(reason=reason)
[ 761.864867] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 761.864867] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 761.864867] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 761.864867] nova-conductor[52435]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0],
[ 761.864867] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 761.864867] nova-conductor[52435]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations(
[ 761.864867] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 761.864867] nova-conductor[52435]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 761.864867] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 761.864867] nova-conductor[52435]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 761.864867] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 761.864867] nova-conductor[52435]: ERROR nova.conductor.manager result = self.transport._send(
[ 761.864867] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 761.864867] nova-conductor[52435]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message,
[ 761.864867] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 761.864867] nova-conductor[52435]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 761.864867] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 761.864867] nova-conductor[52435]: ERROR nova.conductor.manager raise result
[ 761.864867] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 761.864867] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 761.864867] nova-conductor[52435]: ERROR nova.conductor.manager
[ 761.864867] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 761.864867] nova-conductor[52435]: ERROR nova.conductor.manager return func(*args, **kwargs)
[ 761.864867] nova-conductor[52435]: ERROR nova.conductor.manager
[ 761.864867] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 761.864867] nova-conductor[52435]: ERROR nova.conductor.manager selections = self._select_destinations(
[ 761.864867] nova-conductor[52435]: ERROR nova.conductor.manager
[ 761.864867] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 761.864867] nova-conductor[52435]: ERROR nova.conductor.manager selections = self._schedule(
[ 761.864867] nova-conductor[52435]: ERROR nova.conductor.manager
[ 761.864867] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 761.864867] nova-conductor[52435]: ERROR nova.conductor.manager self._ensure_sufficient_hosts(
[ 761.864867] nova-conductor[52435]: ERROR nova.conductor.manager
[ 761.864867] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 761.864867] nova-conductor[52435]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason)
[ 761.864867] nova-conductor[52435]: ERROR nova.conductor.manager
[ 761.864867] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 761.864867] nova-conductor[52435]: ERROR nova.conductor.manager
[ 761.864867] nova-conductor[52435]: ERROR nova.conductor.manager
[ 761.878106] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-c9f92269-dfe5-4076-ab5a-358a15c1fa39 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 761.878106] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-c9f92269-dfe5-4076-ab5a-358a15c1fa39 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 761.878507] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-c9f92269-dfe5-4076-ab5a-358a15c1fa39 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 761.934990] nova-conductor[52436]: ERROR
nova.scheduler.utils [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in 
_allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 3c3207c4-43fc-434a-b522-b2b074acdf74, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 507d89fc-2083-4575-9a9c-f7f350741ef3 was re-scheduled: Binding failed for port 3c3207c4-43fc-434a-b522-b2b074acdf74, please check neutron logs for more information.\n'] [ 761.937760] nova-conductor[52436]: DEBUG nova.conductor.manager [None 
req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Rescheduling: True {{(pid=52436) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 761.937760] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 507d89fc-2083-4575-9a9c-f7f350741ef3.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 507d89fc-2083-4575-9a9c-f7f350741ef3. [ 761.937760] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 507d89fc-2083-4575-9a9c-f7f350741ef3. 
[ 761.948358] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-c9f92269-dfe5-4076-ab5a-358a15c1fa39 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] [instance: d667e8c7-300e-44bf-93e1-e8adc8d562a4] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 761.949193] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-c9f92269-dfe5-4076-ab5a-358a15c1fa39 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 761.949397] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-c9f92269-dfe5-4076-ab5a-358a15c1fa39 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 761.949556] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-c9f92269-dfe5-4076-ab5a-358a15c1fa39 tempest-AttachVolumeShelveTestJSON-1182673235 
tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 761.954537] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-c9f92269-dfe5-4076-ab5a-358a15c1fa39 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 761.954537] nova-conductor[52435]: Traceback (most recent call last):
[ 761.954537] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 761.954537] nova-conductor[52435]: return func(*args, **kwargs)
[ 761.954537] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 761.954537] nova-conductor[52435]: selections = self._select_destinations(
[ 761.954537] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 761.954537] nova-conductor[52435]: selections = self._schedule(
[ 761.954537] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 761.954537] nova-conductor[52435]: self._ensure_sufficient_hosts(
[ 761.954537] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 761.954537] nova-conductor[52435]: raise exception.NoValidHost(reason=reason)
[ 761.954537] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 761.954537] nova-conductor[52435]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 761.955581] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-c9f92269-dfe5-4076-ab5a-358a15c1fa39 tempest-AttachVolumeShelveTestJSON-1182673235 tempest-AttachVolumeShelveTestJSON-1182673235-project-member] [instance: d667e8c7-300e-44bf-93e1-e8adc8d562a4] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 761.984083] nova-conductor[52436]: DEBUG nova.network.neutron [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] deallocate_for_instance() {{(pid=52436) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 762.114443] nova-conductor[52436]: DEBUG nova.network.neutron [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Instance cache missing network info. 
{{(pid=52436) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 762.116557] nova-conductor[52436]: DEBUG nova.network.neutron [None req-81a5ffc2-9b5f-44bf-bbc6-66d5436cbb1b tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 507d89fc-2083-4575-9a9c-f7f350741ef3] Updating instance_info_cache with network_info: [] {{(pid=52436) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 762.525937] nova-conductor[52435]: ERROR nova.scheduler.utils [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', 
' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 9d208dba-9a95-4330-8921-302664ac21ba, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in 
_do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance f3b9789c-ecc6-4b1a-96ec-2c71dba363f7 was re-scheduled: Binding failed for port 9d208dba-9a95-4330-8921-302664ac21ba, please check neutron logs for more information.\n'] [ 762.529214] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Rescheduling: True {{(pid=52435) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 762.530023] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance f3b9789c-ecc6-4b1a-96ec-2c71dba363f7.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance f3b9789c-ecc6-4b1a-96ec-2c71dba363f7. [ 762.530324] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance f3b9789c-ecc6-4b1a-96ec-2c71dba363f7. 
[ 762.581904] nova-conductor[52435]: DEBUG nova.network.neutron [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] deallocate_for_instance() {{(pid=52435) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 762.612702] nova-conductor[52435]: DEBUG nova.network.neutron [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] Instance cache missing network info. {{(pid=52435) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 762.620310] nova-conductor[52435]: DEBUG nova.network.neutron [None req-ecd91ce7-7a4e-4855-aa37-b44768b4b5e5 tempest-ServersAdminNegativeTestJSON-505936840 tempest-ServersAdminNegativeTestJSON-505936840-project-member] [instance: f3b9789c-ecc6-4b1a-96ec-2c71dba363f7] Updating instance_info_cache with network_info: [] {{(pid=52435) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 762.987889] nova-conductor[52436]: ERROR nova.scheduler.utils [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 
1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port d2fae340-0a11-4474-8e26-d713b0ec1239, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 3835af93-7a47-4d3c-9296-256aadddc3b3 was re-scheduled: Binding failed for port d2fae340-0a11-4474-8e26-d713b0ec1239, please check neutron logs for more information.\n'] [ 762.989314] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Rescheduling: True {{(pid=52436) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 762.989407] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 3835af93-7a47-4d3c-9296-256aadddc3b3.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. 
Exhausted all hosts available for retrying build failures for instance 3835af93-7a47-4d3c-9296-256aadddc3b3. [ 762.990053] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 3835af93-7a47-4d3c-9296-256aadddc3b3. [ 763.013196] nova-conductor[52436]: DEBUG nova.network.neutron [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] deallocate_for_instance() {{(pid=52436) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 763.072076] nova-conductor[52436]: DEBUG nova.network.neutron [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Instance cache missing network info. 
{{(pid=52436) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 763.075163] nova-conductor[52436]: DEBUG nova.network.neutron [None req-0ad111bf-e577-4fe7-b842-b72f4e356f96 tempest-TenantUsagesTestJSON-1384199689 tempest-TenantUsagesTestJSON-1384199689-project-member] [instance: 3835af93-7a47-4d3c-9296-256aadddc3b3] Updating instance_info_cache with network_info: [] {{(pid=52436) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 763.959524] nova-conductor[52436]: ERROR nova.conductor.manager [None req-02f4123d-7094-4a25-9772-d45c42fdfd76 tempest-ImagesTestJSON-939914451 tempest-ImagesTestJSON-939914451-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 763.959524] nova-conductor[52436]: Traceback (most recent call last): [ 763.959524] nova-conductor[52436]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 763.959524] nova-conductor[52436]: return func(*args, **kwargs) [ 763.959524] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 763.959524] nova-conductor[52436]: selections = self._select_destinations( [ 763.959524] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 763.959524] nova-conductor[52436]: selections = self._schedule( [ 763.959524] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 763.959524] nova-conductor[52436]: self._ensure_sufficient_hosts( [ 763.959524] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 763.959524] nova-conductor[52436]: raise exception.NoValidHost(reason=reason) [ 763.959524] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 763.959524] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last): [ 763.959524] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 763.959524] nova-conductor[52436]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 763.959524] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 763.959524] nova-conductor[52436]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 763.959524] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 763.959524] nova-conductor[52436]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 763.959524] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 763.959524] nova-conductor[52436]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 763.959524] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 763.959524] nova-conductor[52436]: ERROR nova.conductor.manager result = self.transport._send( [ 763.959524] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 763.959524] nova-conductor[52436]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 763.959524] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 763.959524] nova-conductor[52436]: ERROR nova.conductor.manager return 
self._send(target, ctxt, message, wait_for_reply, timeout, [ 763.959524] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 763.959524] nova-conductor[52436]: ERROR nova.conductor.manager raise result [ 763.959524] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 763.959524] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last): [ 763.959524] nova-conductor[52436]: ERROR nova.conductor.manager [ 763.959524] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 763.959524] nova-conductor[52436]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 763.959524] nova-conductor[52436]: ERROR nova.conductor.manager [ 763.959524] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 763.959524] nova-conductor[52436]: ERROR nova.conductor.manager selections = self._select_destinations( [ 763.959524] nova-conductor[52436]: ERROR nova.conductor.manager [ 763.959524] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 763.959524] nova-conductor[52436]: ERROR nova.conductor.manager selections = self._schedule( [ 763.959524] nova-conductor[52436]: ERROR nova.conductor.manager [ 763.959524] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 763.959524] nova-conductor[52436]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 763.959524] nova-conductor[52436]: ERROR nova.conductor.manager [ 763.959524] nova-conductor[52436]: ERROR nova.conductor.manager File 
"/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 763.959524] nova-conductor[52436]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 763.959524] nova-conductor[52436]: ERROR nova.conductor.manager [ 763.959524] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 763.959524] nova-conductor[52436]: ERROR nova.conductor.manager [ 763.959524] nova-conductor[52436]: ERROR nova.conductor.manager [ 763.971257] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-02f4123d-7094-4a25-9772-d45c42fdfd76 tempest-ImagesTestJSON-939914451 tempest-ImagesTestJSON-939914451-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 763.971529] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-02f4123d-7094-4a25-9772-d45c42fdfd76 tempest-ImagesTestJSON-939914451 tempest-ImagesTestJSON-939914451-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 763.973953] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-02f4123d-7094-4a25-9772-d45c42fdfd76 tempest-ImagesTestJSON-939914451 tempest-ImagesTestJSON-939914451-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 764.039096] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-02f4123d-7094-4a25-9772-d45c42fdfd76 
tempest-ImagesTestJSON-939914451 tempest-ImagesTestJSON-939914451-project-member] [instance: b475ab55-63b9-452d-ad4f-7bf7797ee83e] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 764.039816] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-02f4123d-7094-4a25-9772-d45c42fdfd76 tempest-ImagesTestJSON-939914451 tempest-ImagesTestJSON-939914451-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 764.040803] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-02f4123d-7094-4a25-9772-d45c42fdfd76 tempest-ImagesTestJSON-939914451 tempest-ImagesTestJSON-939914451-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 764.040803] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-02f4123d-7094-4a25-9772-d45c42fdfd76 tempest-ImagesTestJSON-939914451 tempest-ImagesTestJSON-939914451-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) 
inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 764.044125] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-02f4123d-7094-4a25-9772-d45c42fdfd76 tempest-ImagesTestJSON-939914451 tempest-ImagesTestJSON-939914451-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 764.044125] nova-conductor[52436]: Traceback (most recent call last): [ 764.044125] nova-conductor[52436]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 764.044125] nova-conductor[52436]: return func(*args, **kwargs) [ 764.044125] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 764.044125] nova-conductor[52436]: selections = self._select_destinations( [ 764.044125] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 764.044125] nova-conductor[52436]: selections = self._schedule( [ 764.044125] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 764.044125] nova-conductor[52436]: self._ensure_sufficient_hosts( [ 764.044125] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 764.044125] nova-conductor[52436]: raise exception.NoValidHost(reason=reason) [ 764.044125] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 764.044125] nova-conductor[52436]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 764.044125] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-02f4123d-7094-4a25-9772-d45c42fdfd76 tempest-ImagesTestJSON-939914451 tempest-ImagesTestJSON-939914451-project-member] [instance: b475ab55-63b9-452d-ad4f-7bf7797ee83e] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 764.077220] nova-conductor[52435]: ERROR nova.conductor.manager [None req-abb72766-e422-4ed1-b8d6-ce7b902b4754 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 764.077220] nova-conductor[52435]: Traceback (most recent call last): [ 764.077220] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 764.077220] nova-conductor[52435]: return func(*args, **kwargs) [ 764.077220] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 764.077220] nova-conductor[52435]: selections = self._select_destinations( [ 764.077220] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 764.077220] nova-conductor[52435]: selections = self._schedule( [ 764.077220] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 764.077220] nova-conductor[52435]: self._ensure_sufficient_hosts( [ 764.077220] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 764.077220] nova-conductor[52435]: raise exception.NoValidHost(reason=reason) [ 764.077220] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 764.077220] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last): [ 764.077220] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 764.077220] nova-conductor[52435]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 764.077220] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 764.077220] nova-conductor[52435]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 764.077220] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 764.077220] nova-conductor[52435]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 764.077220] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 764.077220] nova-conductor[52435]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 764.077220] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 764.077220] nova-conductor[52435]: ERROR nova.conductor.manager result = self.transport._send( [ 764.077220] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 764.077220] nova-conductor[52435]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 764.077220] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 764.077220] nova-conductor[52435]: ERROR nova.conductor.manager return 
self._send(target, ctxt, message, wait_for_reply, timeout, [ 764.077220] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 764.077220] nova-conductor[52435]: ERROR nova.conductor.manager raise result [ 764.077220] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 764.077220] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last): [ 764.077220] nova-conductor[52435]: ERROR nova.conductor.manager [ 764.077220] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 764.077220] nova-conductor[52435]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 764.077220] nova-conductor[52435]: ERROR nova.conductor.manager [ 764.077220] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 764.077220] nova-conductor[52435]: ERROR nova.conductor.manager selections = self._select_destinations( [ 764.077220] nova-conductor[52435]: ERROR nova.conductor.manager [ 764.077220] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 764.077220] nova-conductor[52435]: ERROR nova.conductor.manager selections = self._schedule( [ 764.077220] nova-conductor[52435]: ERROR nova.conductor.manager [ 764.077220] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 764.077220] nova-conductor[52435]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 764.077220] nova-conductor[52435]: ERROR nova.conductor.manager [ 764.077220] nova-conductor[52435]: ERROR nova.conductor.manager File 
"/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 764.077220] nova-conductor[52435]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 764.077220] nova-conductor[52435]: ERROR nova.conductor.manager [ 764.077220] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 764.077220] nova-conductor[52435]: ERROR nova.conductor.manager [ 764.077220] nova-conductor[52435]: ERROR nova.conductor.manager [ 764.089324] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-abb72766-e422-4ed1-b8d6-ce7b902b4754 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 764.089324] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-abb72766-e422-4ed1-b8d6-ce7b902b4754 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 764.089324] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-abb72766-e422-4ed1-b8d6-ce7b902b4754 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 764.141300] nova-conductor[52435]: DEBUG nova.conductor.manager [None 
req-abb72766-e422-4ed1-b8d6-ce7b902b4754 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 4e35329d-fd40-4a21-85c9-048ccdc69a9d] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 764.142024] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-abb72766-e422-4ed1-b8d6-ce7b902b4754 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 764.142311] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-abb72766-e422-4ed1-b8d6-ce7b902b4754 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 764.142478] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-abb72766-e422-4ed1-b8d6-ce7b902b4754 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 764.145531] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-abb72766-e422-4ed1-b8d6-ce7b902b4754 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 764.145531] nova-conductor[52435]: Traceback (most recent call last): [ 764.145531] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 764.145531] nova-conductor[52435]: return func(*args, **kwargs) [ 764.145531] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 764.145531] nova-conductor[52435]: selections = self._select_destinations( [ 764.145531] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 764.145531] nova-conductor[52435]: selections = self._schedule( [ 764.145531] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 764.145531] nova-conductor[52435]: self._ensure_sufficient_hosts( [ 764.145531] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 764.145531] nova-conductor[52435]: raise exception.NoValidHost(reason=reason) [ 764.145531] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 764.145531] nova-conductor[52435]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 764.147019] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-abb72766-e422-4ed1-b8d6-ce7b902b4754 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: 4e35329d-fd40-4a21-85c9-048ccdc69a9d] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 765.352478] nova-conductor[52435]: ERROR nova.conductor.manager [None req-9e130f23-87ec-4be6-9063-0dbc4ece5d02 tempest-ServerTagsTestJSON-1545938283 tempest-ServerTagsTestJSON-1545938283-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 765.352478] nova-conductor[52435]: Traceback (most recent call last): [ 765.352478] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 765.352478] nova-conductor[52435]: return func(*args, **kwargs) [ 765.352478] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 765.352478] nova-conductor[52435]: selections = self._select_destinations( [ 765.352478] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 765.352478] nova-conductor[52435]: selections = self._schedule( [ 765.352478] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 765.352478] nova-conductor[52435]: self._ensure_sufficient_hosts( [ 765.352478] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 765.352478] nova-conductor[52435]: raise exception.NoValidHost(reason=reason) [ 765.352478] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 765.352478] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last): [ 765.352478] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 765.352478] nova-conductor[52435]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 765.352478] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 765.352478] nova-conductor[52435]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 765.352478] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 765.352478] nova-conductor[52435]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 765.352478] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 765.352478] nova-conductor[52435]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 765.352478] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 765.352478] nova-conductor[52435]: ERROR nova.conductor.manager result = self.transport._send( [ 765.352478] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 765.352478] nova-conductor[52435]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 765.352478] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 765.352478] nova-conductor[52435]: ERROR nova.conductor.manager return 
self._send(target, ctxt, message, wait_for_reply, timeout,
[ 765.352478] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 765.352478] nova-conductor[52435]: ERROR nova.conductor.manager raise result
[ 765.352478] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 765.352478] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 765.352478] nova-conductor[52435]: ERROR nova.conductor.manager
[ 765.352478] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 765.352478] nova-conductor[52435]: ERROR nova.conductor.manager return func(*args, **kwargs)
[ 765.352478] nova-conductor[52435]: ERROR nova.conductor.manager
[ 765.352478] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 765.352478] nova-conductor[52435]: ERROR nova.conductor.manager selections = self._select_destinations(
[ 765.352478] nova-conductor[52435]: ERROR nova.conductor.manager
[ 765.352478] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 765.352478] nova-conductor[52435]: ERROR nova.conductor.manager selections = self._schedule(
[ 765.352478] nova-conductor[52435]: ERROR nova.conductor.manager
[ 765.352478] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 765.352478] nova-conductor[52435]: ERROR nova.conductor.manager self._ensure_sufficient_hosts(
[ 765.352478] nova-conductor[52435]: ERROR nova.conductor.manager
[ 765.352478] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 765.352478] nova-conductor[52435]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason)
[ 765.352478] nova-conductor[52435]: ERROR nova.conductor.manager
[ 765.352478] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 765.352478] nova-conductor[52435]: ERROR nova.conductor.manager
[ 765.352478] nova-conductor[52435]: ERROR nova.conductor.manager
[ 765.362226] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-9e130f23-87ec-4be6-9063-0dbc4ece5d02 tempest-ServerTagsTestJSON-1545938283 tempest-ServerTagsTestJSON-1545938283-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 765.362451] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-9e130f23-87ec-4be6-9063-0dbc4ece5d02 tempest-ServerTagsTestJSON-1545938283 tempest-ServerTagsTestJSON-1545938283-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 765.362616] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-9e130f23-87ec-4be6-9063-0dbc4ece5d02 tempest-ServerTagsTestJSON-1545938283 tempest-ServerTagsTestJSON-1545938283-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 765.453107] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-9e130f23-87ec-4be6-9063-0dbc4ece5d02 tempest-ServerTagsTestJSON-1545938283 tempest-ServerTagsTestJSON-1545938283-project-member] [instance: 9da4f04d-730f-4974-8b5f-f0a2d8d35a65] block_device_mapping [BlockDeviceMapping(attachment_id=<?>,boot_index=0,connection_info=None,created_at=<?>,delete_on_termination=True,deleted=<?>,deleted_at=<?>,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=<?>,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=<?>,instance_uuid=<?>,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=<?>,uuid=<?>,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 765.453893] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-9e130f23-87ec-4be6-9063-0dbc4ece5d02 tempest-ServerTagsTestJSON-1545938283 tempest-ServerTagsTestJSON-1545938283-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 765.454074] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-9e130f23-87ec-4be6-9063-0dbc4ece5d02 tempest-ServerTagsTestJSON-1545938283 tempest-ServerTagsTestJSON-1545938283-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 765.454266] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-9e130f23-87ec-4be6-9063-0dbc4ece5d02 tempest-ServerTagsTestJSON-1545938283 tempest-ServerTagsTestJSON-1545938283-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 765.462549] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-9e130f23-87ec-4be6-9063-0dbc4ece5d02 tempest-ServerTagsTestJSON-1545938283 tempest-ServerTagsTestJSON-1545938283-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 765.462549] nova-conductor[52435]: Traceback (most recent call last):
[ 765.462549] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 765.462549] nova-conductor[52435]: return func(*args, **kwargs)
[ 765.462549] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 765.462549] nova-conductor[52435]: selections = self._select_destinations(
[ 765.462549] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 765.462549] nova-conductor[52435]: selections = self._schedule(
[ 765.462549] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 765.462549] nova-conductor[52435]: self._ensure_sufficient_hosts(
[ 765.462549] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 765.462549] nova-conductor[52435]: raise exception.NoValidHost(reason=reason)
[ 765.462549] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 765.462549] nova-conductor[52435]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 765.463147] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-9e130f23-87ec-4be6-9063-0dbc4ece5d02 tempest-ServerTagsTestJSON-1545938283 tempest-ServerTagsTestJSON-1545938283-project-member] [instance: 9da4f04d-730f-4974-8b5f-f0a2d8d35a65] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 765.966961] nova-conductor[52436]: ERROR nova.scheduler.utils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance 5c575e05-5a7c-49b8-b914-9b4a4e347bfc was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"]
[ 765.967574] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Rescheduling: True {{(pid=52436) build_instances /opt/stack/nova/nova/conductor/manager.py:695}}
[ 765.967823] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 5c575e05-5a7c-49b8-b914-9b4a4e347bfc.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 5c575e05-5a7c-49b8-b914-9b4a4e347bfc.
[ 765.968133] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-856a5074-044d-46a2-b268-e5d3bf702adf tempest-ServersAaction247Test-1504655183 tempest-ServersAaction247Test-1504655183-project-member] [instance: 5c575e05-5a7c-49b8-b914-9b4a4e347bfc] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 5c575e05-5a7c-49b8-b914-9b4a4e347bfc.
[ 766.561286] nova-conductor[52436]: ERROR nova.conductor.manager [None req-b1bf82b5-ad3b-4787-ab76-0773949b23ee tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 766.561286] nova-conductor[52436]: Traceback (most recent call last):
[ 766.561286] nova-conductor[52436]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 766.561286] nova-conductor[52436]: return func(*args, **kwargs)
[ 766.561286] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 766.561286] nova-conductor[52436]: selections = self._select_destinations(
[ 766.561286] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 766.561286] nova-conductor[52436]: selections = self._schedule(
[ 766.561286] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 766.561286] nova-conductor[52436]: self._ensure_sufficient_hosts(
[ 766.561286] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 766.561286] nova-conductor[52436]: raise exception.NoValidHost(reason=reason)
[ 766.561286] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 766.561286] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 766.561286] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 766.561286] nova-conductor[52436]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0],
[ 766.561286] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 766.561286] nova-conductor[52436]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations(
[ 766.561286] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 766.561286] nova-conductor[52436]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 766.561286] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 766.561286] nova-conductor[52436]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 766.561286] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 766.561286] nova-conductor[52436]: ERROR nova.conductor.manager result = self.transport._send(
[ 766.561286] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 766.561286] nova-conductor[52436]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message,
[ 766.561286] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 766.561286] nova-conductor[52436]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 766.561286] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 766.561286] nova-conductor[52436]: ERROR nova.conductor.manager raise result
[ 766.561286] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 766.561286] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 766.561286] nova-conductor[52436]: ERROR nova.conductor.manager
[ 766.561286] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 766.561286] nova-conductor[52436]: ERROR nova.conductor.manager return func(*args, **kwargs)
[ 766.561286] nova-conductor[52436]: ERROR nova.conductor.manager
[ 766.561286] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 766.561286] nova-conductor[52436]: ERROR nova.conductor.manager selections = self._select_destinations(
[ 766.561286] nova-conductor[52436]: ERROR nova.conductor.manager
[ 766.561286] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 766.561286] nova-conductor[52436]: ERROR nova.conductor.manager selections = self._schedule(
[ 766.561286] nova-conductor[52436]: ERROR nova.conductor.manager
[ 766.561286] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 766.561286] nova-conductor[52436]: ERROR nova.conductor.manager self._ensure_sufficient_hosts(
[ 766.561286] nova-conductor[52436]: ERROR nova.conductor.manager
[ 766.561286] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 766.561286] nova-conductor[52436]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason)
[ 766.561286] nova-conductor[52436]: ERROR nova.conductor.manager
[ 766.561286] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 766.561286] nova-conductor[52436]: ERROR nova.conductor.manager
[ 766.561286] nova-conductor[52436]: ERROR nova.conductor.manager
[ 766.570343] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b1bf82b5-ad3b-4787-ab76-0773949b23ee tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 766.570578] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b1bf82b5-ad3b-4787-ab76-0773949b23ee tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 766.570743] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b1bf82b5-ad3b-4787-ab76-0773949b23ee tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 766.619384] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-b1bf82b5-ad3b-4787-ab76-0773949b23ee tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] [instance: c102ba14-4e5d-409a-8c15-003a30290638] block_device_mapping [BlockDeviceMapping(attachment_id=<?>,boot_index=0,connection_info=None,created_at=<?>,delete_on_termination=True,deleted=<?>,deleted_at=<?>,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=<?>,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=<?>,instance_uuid=<?>,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=<?>,uuid=<?>,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 766.620332] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b1bf82b5-ad3b-4787-ab76-0773949b23ee tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 766.620551] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b1bf82b5-ad3b-4787-ab76-0773949b23ee tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 766.620716] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b1bf82b5-ad3b-4787-ab76-0773949b23ee tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 766.626932] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-b1bf82b5-ad3b-4787-ab76-0773949b23ee tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 766.626932] nova-conductor[52436]: Traceback (most recent call last):
[ 766.626932] nova-conductor[52436]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 766.626932] nova-conductor[52436]: return func(*args, **kwargs)
[ 766.626932] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 766.626932] nova-conductor[52436]: selections = self._select_destinations(
[ 766.626932] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 766.626932] nova-conductor[52436]: selections = self._schedule(
[ 766.626932] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 766.626932] nova-conductor[52436]: self._ensure_sufficient_hosts(
[ 766.626932] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 766.626932] nova-conductor[52436]: raise exception.NoValidHost(reason=reason)
[ 766.626932] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 766.626932] nova-conductor[52436]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 766.627438] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-b1bf82b5-ad3b-4787-ab76-0773949b23ee tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] [instance: c102ba14-4e5d-409a-8c15-003a30290638] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 766.651604] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b1bf82b5-ad3b-4787-ab76-0773949b23ee tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 766.651821] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b1bf82b5-ad3b-4787-ab76-0773949b23ee tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 766.652153] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b1bf82b5-ad3b-4787-ab76-0773949b23ee tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 767.212324] nova-conductor[52435]: ERROR nova.conductor.manager [None req-1023dba8-6047-4e73-9d20-986e106b8264 tempest-ImagesTestJSON-939914451 tempest-ImagesTestJSON-939914451-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 767.212324] nova-conductor[52435]: Traceback (most recent call last):
[ 767.212324] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 767.212324] nova-conductor[52435]: return func(*args, **kwargs)
[ 767.212324] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 767.212324] nova-conductor[52435]: selections = self._select_destinations(
[ 767.212324] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 767.212324] nova-conductor[52435]: selections = self._schedule(
[ 767.212324] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 767.212324] nova-conductor[52435]: self._ensure_sufficient_hosts(
[ 767.212324] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 767.212324] nova-conductor[52435]: raise exception.NoValidHost(reason=reason)
[ 767.212324] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 767.212324] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 767.212324] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 767.212324] nova-conductor[52435]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0],
[ 767.212324] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 767.212324] nova-conductor[52435]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations(
[ 767.212324] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 767.212324] nova-conductor[52435]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 767.212324] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 767.212324] nova-conductor[52435]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 767.212324] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 767.212324] nova-conductor[52435]: ERROR nova.conductor.manager result = self.transport._send(
[ 767.212324] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 767.212324] nova-conductor[52435]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message,
[ 767.212324] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 767.212324] nova-conductor[52435]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 767.212324] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 767.212324] nova-conductor[52435]: ERROR nova.conductor.manager raise result
[ 767.212324] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 767.212324] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 767.212324] nova-conductor[52435]: ERROR nova.conductor.manager
[ 767.212324] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 767.212324] nova-conductor[52435]: ERROR nova.conductor.manager return func(*args, **kwargs)
[ 767.212324] nova-conductor[52435]: ERROR nova.conductor.manager
[ 767.212324] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 767.212324] nova-conductor[52435]: ERROR nova.conductor.manager selections = self._select_destinations(
[ 767.212324] nova-conductor[52435]: ERROR nova.conductor.manager
[ 767.212324] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 767.212324] nova-conductor[52435]: ERROR nova.conductor.manager selections = self._schedule(
[ 767.212324] nova-conductor[52435]: ERROR nova.conductor.manager
[ 767.212324] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 767.212324] nova-conductor[52435]: ERROR nova.conductor.manager self._ensure_sufficient_hosts(
[ 767.212324] nova-conductor[52435]: ERROR nova.conductor.manager
[ 767.212324] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 767.212324] nova-conductor[52435]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason)
[ 767.212324] nova-conductor[52435]: ERROR nova.conductor.manager
[ 767.212324] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 767.212324] nova-conductor[52435]: ERROR nova.conductor.manager
[ 767.212324] nova-conductor[52435]: ERROR nova.conductor.manager
[ 767.226970] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-1023dba8-6047-4e73-9d20-986e106b8264 tempest-ImagesTestJSON-939914451 tempest-ImagesTestJSON-939914451-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 767.227219] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-1023dba8-6047-4e73-9d20-986e106b8264 tempest-ImagesTestJSON-939914451 tempest-ImagesTestJSON-939914451-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 767.227675] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-1023dba8-6047-4e73-9d20-986e106b8264 tempest-ImagesTestJSON-939914451 tempest-ImagesTestJSON-939914451-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 767.276131] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-1023dba8-6047-4e73-9d20-986e106b8264 tempest-ImagesTestJSON-939914451 tempest-ImagesTestJSON-939914451-project-member] [instance: 474d025d-7a80-458d-ae4b-8ae16d6a9fff] block_device_mapping [BlockDeviceMapping(attachment_id=<?>,boot_index=0,connection_info=None,created_at=<?>,delete_on_termination=True,deleted=<?>,deleted_at=<?>,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=<?>,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=<?>,instance_uuid=<?>,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=<?>,uuid=<?>,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 767.276131] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-1023dba8-6047-4e73-9d20-986e106b8264 tempest-ImagesTestJSON-939914451 tempest-ImagesTestJSON-939914451-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 767.276131] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-1023dba8-6047-4e73-9d20-986e106b8264 tempest-ImagesTestJSON-939914451 tempest-ImagesTestJSON-939914451-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 767.276131] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-1023dba8-6047-4e73-9d20-986e106b8264 tempest-ImagesTestJSON-939914451 tempest-ImagesTestJSON-939914451-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 767.281171] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-1023dba8-6047-4e73-9d20-986e106b8264 tempest-ImagesTestJSON-939914451 tempest-ImagesTestJSON-939914451-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 767.281171] nova-conductor[52435]: Traceback (most recent call last):
[ 767.281171] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 767.281171] nova-conductor[52435]: return func(*args, **kwargs)
[ 767.281171] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 767.281171] nova-conductor[52435]: selections = self._select_destinations(
[ 767.281171] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 767.281171] nova-conductor[52435]: selections = self._schedule(
[ 767.281171] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 767.281171] nova-conductor[52435]: self._ensure_sufficient_hosts(
[ 767.281171] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 767.281171] nova-conductor[52435]: raise exception.NoValidHost(reason=reason)
[ 767.281171] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 767.281171] nova-conductor[52435]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 767.281171] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-1023dba8-6047-4e73-9d20-986e106b8264 tempest-ImagesTestJSON-939914451 tempest-ImagesTestJSON-939914451-project-member] [instance: 474d025d-7a80-458d-ae4b-8ae16d6a9fff] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 767.909829] nova-conductor[52436]: ERROR nova.conductor.manager [None req-6161f1a7-542c-4e3d-96bf-91e2389fcf00 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 767.909829] nova-conductor[52436]: Traceback (most recent call last): [ 767.909829] nova-conductor[52436]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 767.909829] nova-conductor[52436]: return func(*args, **kwargs) [ 767.909829] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 767.909829] nova-conductor[52436]: selections = self._select_destinations( [ 767.909829] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 767.909829] nova-conductor[52436]: selections = self._schedule( [ 767.909829] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 767.909829] nova-conductor[52436]: self._ensure_sufficient_hosts( [ 767.909829] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 767.909829] nova-conductor[52436]: raise exception.NoValidHost(reason=reason) [ 767.909829] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 767.909829] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last): [ 767.909829] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 767.909829] nova-conductor[52436]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 767.909829] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 767.909829] nova-conductor[52436]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 767.909829] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 767.909829] nova-conductor[52436]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 767.909829] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 767.909829] nova-conductor[52436]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 767.909829] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 767.909829] nova-conductor[52436]: ERROR nova.conductor.manager result = self.transport._send( [ 767.909829] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 767.909829] nova-conductor[52436]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 767.909829] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 767.909829] nova-conductor[52436]: ERROR nova.conductor.manager return 
self._send(target, ctxt, message, wait_for_reply, timeout, [ 767.909829] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 767.909829] nova-conductor[52436]: ERROR nova.conductor.manager raise result [ 767.909829] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 767.909829] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last): [ 767.909829] nova-conductor[52436]: ERROR nova.conductor.manager [ 767.909829] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 767.909829] nova-conductor[52436]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 767.909829] nova-conductor[52436]: ERROR nova.conductor.manager [ 767.909829] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 767.909829] nova-conductor[52436]: ERROR nova.conductor.manager selections = self._select_destinations( [ 767.909829] nova-conductor[52436]: ERROR nova.conductor.manager [ 767.909829] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 767.909829] nova-conductor[52436]: ERROR nova.conductor.manager selections = self._schedule( [ 767.909829] nova-conductor[52436]: ERROR nova.conductor.manager [ 767.909829] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 767.909829] nova-conductor[52436]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 767.909829] nova-conductor[52436]: ERROR nova.conductor.manager [ 767.909829] nova-conductor[52436]: ERROR nova.conductor.manager File 
"/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 767.909829] nova-conductor[52436]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 767.909829] nova-conductor[52436]: ERROR nova.conductor.manager [ 767.909829] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 767.909829] nova-conductor[52436]: ERROR nova.conductor.manager [ 767.909829] nova-conductor[52436]: ERROR nova.conductor.manager [ 767.922784] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-6161f1a7-542c-4e3d-96bf-91e2389fcf00 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 767.922784] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-6161f1a7-542c-4e3d-96bf-91e2389fcf00 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 767.922784] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-6161f1a7-542c-4e3d-96bf-91e2389fcf00 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 767.972075] nova-conductor[52436]: DEBUG nova.conductor.manager [None 
req-6161f1a7-542c-4e3d-96bf-91e2389fcf00 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: ce9ebfd1-ebbb-4d68-af21-c2d8d39893e0] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 767.972760] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-6161f1a7-542c-4e3d-96bf-91e2389fcf00 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 767.972963] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-6161f1a7-542c-4e3d-96bf-91e2389fcf00 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 767.973144] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-6161f1a7-542c-4e3d-96bf-91e2389fcf00 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 767.976220] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-6161f1a7-542c-4e3d-96bf-91e2389fcf00 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 767.976220] nova-conductor[52436]: Traceback (most recent call last): [ 767.976220] nova-conductor[52436]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 767.976220] nova-conductor[52436]: return func(*args, **kwargs) [ 767.976220] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 767.976220] nova-conductor[52436]: selections = self._select_destinations( [ 767.976220] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 767.976220] nova-conductor[52436]: selections = self._schedule( [ 767.976220] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 767.976220] nova-conductor[52436]: self._ensure_sufficient_hosts( [ 767.976220] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 767.976220] nova-conductor[52436]: raise exception.NoValidHost(reason=reason) [ 767.976220] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 767.976220] nova-conductor[52436]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 767.976756] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-6161f1a7-542c-4e3d-96bf-91e2389fcf00 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: ce9ebfd1-ebbb-4d68-af21-c2d8d39893e0] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 768.905649] nova-conductor[52435]: ERROR nova.conductor.manager [None req-28c1d084-4924-4b54-8c63-35fc1a715240 tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 768.905649] nova-conductor[52435]: Traceback (most recent call last): [ 768.905649] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 768.905649] nova-conductor[52435]: return func(*args, **kwargs) [ 768.905649] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 768.905649] nova-conductor[52435]: selections = self._select_destinations( [ 768.905649] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 768.905649] nova-conductor[52435]: selections = self._schedule( [ 768.905649] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 768.905649] nova-conductor[52435]: self._ensure_sufficient_hosts( [ 768.905649] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 768.905649] nova-conductor[52435]: raise exception.NoValidHost(reason=reason) [ 768.905649] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 768.905649] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last): [ 768.905649] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 768.905649] nova-conductor[52435]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 768.905649] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 768.905649] nova-conductor[52435]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 768.905649] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 768.905649] nova-conductor[52435]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 768.905649] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 768.905649] nova-conductor[52435]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 768.905649] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 768.905649] nova-conductor[52435]: ERROR nova.conductor.manager result = self.transport._send( [ 768.905649] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 768.905649] nova-conductor[52435]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 768.905649] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 768.905649] nova-conductor[52435]: ERROR nova.conductor.manager return 
self._send(target, ctxt, message, wait_for_reply, timeout, [ 768.905649] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 768.905649] nova-conductor[52435]: ERROR nova.conductor.manager raise result [ 768.905649] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 768.905649] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last): [ 768.905649] nova-conductor[52435]: ERROR nova.conductor.manager [ 768.905649] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 768.905649] nova-conductor[52435]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 768.905649] nova-conductor[52435]: ERROR nova.conductor.manager [ 768.905649] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 768.905649] nova-conductor[52435]: ERROR nova.conductor.manager selections = self._select_destinations( [ 768.905649] nova-conductor[52435]: ERROR nova.conductor.manager [ 768.905649] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 768.905649] nova-conductor[52435]: ERROR nova.conductor.manager selections = self._schedule( [ 768.905649] nova-conductor[52435]: ERROR nova.conductor.manager [ 768.905649] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 768.905649] nova-conductor[52435]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 768.905649] nova-conductor[52435]: ERROR nova.conductor.manager [ 768.905649] nova-conductor[52435]: ERROR nova.conductor.manager File 
"/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 768.905649] nova-conductor[52435]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 768.905649] nova-conductor[52435]: ERROR nova.conductor.manager [ 768.905649] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 768.905649] nova-conductor[52435]: ERROR nova.conductor.manager [ 768.905649] nova-conductor[52435]: ERROR nova.conductor.manager [ 768.913823] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-28c1d084-4924-4b54-8c63-35fc1a715240 tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 768.913823] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-28c1d084-4924-4b54-8c63-35fc1a715240 tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 768.913823] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-28c1d084-4924-4b54-8c63-35fc1a715240 tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 768.957394] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-28c1d084-4924-4b54-8c63-35fc1a715240 
tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] [instance: 92161e7e-0d3e-4041-9194-0d2abe17fe22] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 768.958250] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-28c1d084-4924-4b54-8c63-35fc1a715240 tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 768.958458] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-28c1d084-4924-4b54-8c63-35fc1a715240 tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 768.958678] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-28c1d084-4924-4b54-8c63-35fc1a715240 tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s 
{{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 768.961744] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-28c1d084-4924-4b54-8c63-35fc1a715240 tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 768.961744] nova-conductor[52435]: Traceback (most recent call last): [ 768.961744] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 768.961744] nova-conductor[52435]: return func(*args, **kwargs) [ 768.961744] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 768.961744] nova-conductor[52435]: selections = self._select_destinations( [ 768.961744] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 768.961744] nova-conductor[52435]: selections = self._schedule( [ 768.961744] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 768.961744] nova-conductor[52435]: self._ensure_sufficient_hosts( [ 768.961744] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 768.961744] nova-conductor[52435]: raise exception.NoValidHost(reason=reason) [ 768.961744] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 768.961744] nova-conductor[52435]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 768.962285] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-28c1d084-4924-4b54-8c63-35fc1a715240 tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] [instance: 92161e7e-0d3e-4041-9194-0d2abe17fe22] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 770.421891] nova-conductor[52436]: ERROR nova.conductor.manager [None req-bdff6f9b-c4d0-4dbc-8a2e-78fe3e75ab53 tempest-ImagesTestJSON-939914451 tempest-ImagesTestJSON-939914451-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 770.421891] nova-conductor[52436]: Traceback (most recent call last): [ 770.421891] nova-conductor[52436]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 770.421891] nova-conductor[52436]: return func(*args, **kwargs) [ 770.421891] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 770.421891] nova-conductor[52436]: selections = self._select_destinations( [ 770.421891] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 770.421891] nova-conductor[52436]: selections = self._schedule( [ 770.421891] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 770.421891] nova-conductor[52436]: self._ensure_sufficient_hosts( [ 770.421891] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 770.421891] nova-conductor[52436]: raise exception.NoValidHost(reason=reason) [ 770.421891] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 770.421891] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last): [ 770.421891] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 770.421891] nova-conductor[52436]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 770.421891] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 770.421891] nova-conductor[52436]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 770.421891] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 770.421891] nova-conductor[52436]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 770.421891] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 770.421891] nova-conductor[52436]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 770.421891] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 770.421891] nova-conductor[52436]: ERROR nova.conductor.manager result = self.transport._send( [ 770.421891] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 770.421891] nova-conductor[52436]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 770.421891] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 770.421891] nova-conductor[52436]: ERROR nova.conductor.manager return 
self._send(target, ctxt, message, wait_for_reply, timeout, [ 770.421891] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 770.421891] nova-conductor[52436]: ERROR nova.conductor.manager raise result [ 770.421891] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 770.421891] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last): [ 770.421891] nova-conductor[52436]: ERROR nova.conductor.manager [ 770.421891] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 770.421891] nova-conductor[52436]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 770.421891] nova-conductor[52436]: ERROR nova.conductor.manager [ 770.421891] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 770.421891] nova-conductor[52436]: ERROR nova.conductor.manager selections = self._select_destinations( [ 770.421891] nova-conductor[52436]: ERROR nova.conductor.manager [ 770.421891] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 770.421891] nova-conductor[52436]: ERROR nova.conductor.manager selections = self._schedule( [ 770.421891] nova-conductor[52436]: ERROR nova.conductor.manager [ 770.421891] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 770.421891] nova-conductor[52436]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 770.421891] nova-conductor[52436]: ERROR nova.conductor.manager [ 770.421891] nova-conductor[52436]: ERROR nova.conductor.manager File 
"/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 770.421891] nova-conductor[52436]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 770.421891] nova-conductor[52436]: ERROR nova.conductor.manager [ 770.421891] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 770.421891] nova-conductor[52436]: ERROR nova.conductor.manager [ 770.421891] nova-conductor[52436]: ERROR nova.conductor.manager [ 770.433349] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-bdff6f9b-c4d0-4dbc-8a2e-78fe3e75ab53 tempest-ImagesTestJSON-939914451 tempest-ImagesTestJSON-939914451-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 770.433349] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-bdff6f9b-c4d0-4dbc-8a2e-78fe3e75ab53 tempest-ImagesTestJSON-939914451 tempest-ImagesTestJSON-939914451-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 770.433606] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-bdff6f9b-c4d0-4dbc-8a2e-78fe3e75ab53 tempest-ImagesTestJSON-939914451 tempest-ImagesTestJSON-939914451-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 770.481564] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-bdff6f9b-c4d0-4dbc-8a2e-78fe3e75ab53 
tempest-ImagesTestJSON-939914451 tempest-ImagesTestJSON-939914451-project-member] [instance: 818cd3e9-8ba6-46e1-8450-3513e444f7d7] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 770.482157] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-bdff6f9b-c4d0-4dbc-8a2e-78fe3e75ab53 tempest-ImagesTestJSON-939914451 tempest-ImagesTestJSON-939914451-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 770.482385] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-bdff6f9b-c4d0-4dbc-8a2e-78fe3e75ab53 tempest-ImagesTestJSON-939914451 tempest-ImagesTestJSON-939914451-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 770.482551] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-bdff6f9b-c4d0-4dbc-8a2e-78fe3e75ab53 tempest-ImagesTestJSON-939914451 tempest-ImagesTestJSON-939914451-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) 
inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 770.485887] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-bdff6f9b-c4d0-4dbc-8a2e-78fe3e75ab53 tempest-ImagesTestJSON-939914451 tempest-ImagesTestJSON-939914451-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 770.485887] nova-conductor[52436]: Traceback (most recent call last): [ 770.485887] nova-conductor[52436]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 770.485887] nova-conductor[52436]: return func(*args, **kwargs) [ 770.485887] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 770.485887] nova-conductor[52436]: selections = self._select_destinations( [ 770.485887] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 770.485887] nova-conductor[52436]: selections = self._schedule( [ 770.485887] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 770.485887] nova-conductor[52436]: self._ensure_sufficient_hosts( [ 770.485887] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 770.485887] nova-conductor[52436]: raise exception.NoValidHost(reason=reason) [ 770.485887] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 770.485887] nova-conductor[52436]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 770.486602] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-bdff6f9b-c4d0-4dbc-8a2e-78fe3e75ab53 tempest-ImagesTestJSON-939914451 tempest-ImagesTestJSON-939914451-project-member] [instance: 818cd3e9-8ba6-46e1-8450-3513e444f7d7] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 770.972851] nova-conductor[52435]: ERROR nova.conductor.manager [None req-5fea3243-a26e-4a8b-97d8-8137313a5644 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 770.972851] nova-conductor[52435]: Traceback (most recent call last):
[ 770.972851] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 770.972851] nova-conductor[52435]: return func(*args, **kwargs)
[ 770.972851] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 770.972851] nova-conductor[52435]: selections = self._select_destinations(
[ 770.972851] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 770.972851] nova-conductor[52435]: selections = self._schedule(
[ 770.972851] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 770.972851] nova-conductor[52435]: self._ensure_sufficient_hosts(
[ 770.972851] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 770.972851] nova-conductor[52435]: raise exception.NoValidHost(reason=reason)
[ 770.972851] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 770.972851] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 770.972851] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 770.972851] nova-conductor[52435]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0],
[ 770.972851] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 770.972851] nova-conductor[52435]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations(
[ 770.972851] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 770.972851] nova-conductor[52435]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 770.972851] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 770.972851] nova-conductor[52435]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 770.972851] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 770.972851] nova-conductor[52435]: ERROR nova.conductor.manager result = self.transport._send(
[ 770.972851] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 770.972851] nova-conductor[52435]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message,
[ 770.972851] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 770.972851] nova-conductor[52435]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 770.972851] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 770.972851] nova-conductor[52435]: ERROR nova.conductor.manager raise result
[ 770.972851] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 770.972851] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 770.972851] nova-conductor[52435]: ERROR nova.conductor.manager
[ 770.972851] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 770.972851] nova-conductor[52435]: ERROR nova.conductor.manager return func(*args, **kwargs)
[ 770.972851] nova-conductor[52435]: ERROR nova.conductor.manager
[ 770.972851] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 770.972851] nova-conductor[52435]: ERROR nova.conductor.manager selections = self._select_destinations(
[ 770.972851] nova-conductor[52435]: ERROR nova.conductor.manager
[ 770.972851] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 770.972851] nova-conductor[52435]: ERROR nova.conductor.manager selections = self._schedule(
[ 770.972851] nova-conductor[52435]: ERROR nova.conductor.manager
[ 770.972851] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 770.972851] nova-conductor[52435]: ERROR nova.conductor.manager self._ensure_sufficient_hosts(
[ 770.972851] nova-conductor[52435]: ERROR nova.conductor.manager
[ 770.972851] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 770.972851] nova-conductor[52435]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason)
[ 770.972851] nova-conductor[52435]: ERROR nova.conductor.manager
[ 770.972851] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 770.972851] nova-conductor[52435]: ERROR nova.conductor.manager
[ 770.972851] nova-conductor[52435]: ERROR nova.conductor.manager
[ 770.985227] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-5fea3243-a26e-4a8b-97d8-8137313a5644 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 770.985464] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-5fea3243-a26e-4a8b-97d8-8137313a5644 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 770.985633] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-5fea3243-a26e-4a8b-97d8-8137313a5644 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 771.037392] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-5fea3243-a26e-4a8b-97d8-8137313a5644 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: f4bdf956-060f-48ba-a4e7-95120d879dd9] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 771.037483] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-5fea3243-a26e-4a8b-97d8-8137313a5644 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 771.037704] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-5fea3243-a26e-4a8b-97d8-8137313a5644 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 771.038112] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-5fea3243-a26e-4a8b-97d8-8137313a5644 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 771.042462] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-5fea3243-a26e-4a8b-97d8-8137313a5644 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 771.042462] nova-conductor[52435]: Traceback (most recent call last):
[ 771.042462] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 771.042462] nova-conductor[52435]: return func(*args, **kwargs)
[ 771.042462] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 771.042462] nova-conductor[52435]: selections = self._select_destinations(
[ 771.042462] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 771.042462] nova-conductor[52435]: selections = self._schedule(
[ 771.042462] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 771.042462] nova-conductor[52435]: self._ensure_sufficient_hosts(
[ 771.042462] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 771.042462] nova-conductor[52435]: raise exception.NoValidHost(reason=reason)
[ 771.042462] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 771.042462] nova-conductor[52435]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 771.043327] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-5fea3243-a26e-4a8b-97d8-8137313a5644 tempest-DeleteServersTestJSON-1609465879 tempest-DeleteServersTestJSON-1609465879-project-member] [instance: f4bdf956-060f-48ba-a4e7-95120d879dd9] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 772.405589] nova-conductor[52436]: ERROR nova.conductor.manager [None req-80f3d131-4a42-4a14-9c8b-86e47a72396f tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 772.405589] nova-conductor[52436]: Traceback (most recent call last):
[ 772.405589] nova-conductor[52436]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 772.405589] nova-conductor[52436]: return func(*args, **kwargs)
[ 772.405589] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 772.405589] nova-conductor[52436]: selections = self._select_destinations(
[ 772.405589] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 772.405589] nova-conductor[52436]: selections = self._schedule(
[ 772.405589] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 772.405589] nova-conductor[52436]: self._ensure_sufficient_hosts(
[ 772.405589] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 772.405589] nova-conductor[52436]: raise exception.NoValidHost(reason=reason)
[ 772.405589] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 772.405589] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 772.405589] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 772.405589] nova-conductor[52436]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0],
[ 772.405589] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 772.405589] nova-conductor[52436]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations(
[ 772.405589] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 772.405589] nova-conductor[52436]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 772.405589] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 772.405589] nova-conductor[52436]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 772.405589] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 772.405589] nova-conductor[52436]: ERROR nova.conductor.manager result = self.transport._send(
[ 772.405589] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 772.405589] nova-conductor[52436]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message,
[ 772.405589] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 772.405589] nova-conductor[52436]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 772.405589] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 772.405589] nova-conductor[52436]: ERROR nova.conductor.manager raise result
[ 772.405589] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 772.405589] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 772.405589] nova-conductor[52436]: ERROR nova.conductor.manager
[ 772.405589] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 772.405589] nova-conductor[52436]: ERROR nova.conductor.manager return func(*args, **kwargs)
[ 772.405589] nova-conductor[52436]: ERROR nova.conductor.manager
[ 772.405589] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 772.405589] nova-conductor[52436]: ERROR nova.conductor.manager selections = self._select_destinations(
[ 772.405589] nova-conductor[52436]: ERROR nova.conductor.manager
[ 772.405589] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 772.405589] nova-conductor[52436]: ERROR nova.conductor.manager selections = self._schedule(
[ 772.405589] nova-conductor[52436]: ERROR nova.conductor.manager
[ 772.405589] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 772.405589] nova-conductor[52436]: ERROR nova.conductor.manager self._ensure_sufficient_hosts(
[ 772.405589] nova-conductor[52436]: ERROR nova.conductor.manager
[ 772.405589] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 772.405589] nova-conductor[52436]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason)
[ 772.405589] nova-conductor[52436]: ERROR nova.conductor.manager
[ 772.405589] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 772.405589] nova-conductor[52436]: ERROR nova.conductor.manager
[ 772.405589] nova-conductor[52436]: ERROR nova.conductor.manager
[ 772.413832] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-80f3d131-4a42-4a14-9c8b-86e47a72396f tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 772.414154] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-80f3d131-4a42-4a14-9c8b-86e47a72396f tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 772.414382] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-80f3d131-4a42-4a14-9c8b-86e47a72396f tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 772.474727] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-80f3d131-4a42-4a14-9c8b-86e47a72396f tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] [instance: 94365c52-b51b-47c0-a45d-e9239537b462] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 772.476432] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-80f3d131-4a42-4a14-9c8b-86e47a72396f tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 772.476432] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-80f3d131-4a42-4a14-9c8b-86e47a72396f tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 772.476432] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-80f3d131-4a42-4a14-9c8b-86e47a72396f tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 772.479503] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-80f3d131-4a42-4a14-9c8b-86e47a72396f tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 772.479503] nova-conductor[52436]: Traceback (most recent call last):
[ 772.479503] nova-conductor[52436]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 772.479503] nova-conductor[52436]: return func(*args, **kwargs)
[ 772.479503] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 772.479503] nova-conductor[52436]: selections = self._select_destinations(
[ 772.479503] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 772.479503] nova-conductor[52436]: selections = self._schedule(
[ 772.479503] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 772.479503] nova-conductor[52436]: self._ensure_sufficient_hosts(
[ 772.479503] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 772.479503] nova-conductor[52436]: raise exception.NoValidHost(reason=reason)
[ 772.479503] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 772.479503] nova-conductor[52436]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 772.480018] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-80f3d131-4a42-4a14-9c8b-86e47a72396f tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] [instance: 94365c52-b51b-47c0-a45d-e9239537b462] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 773.130097] nova-conductor[52435]: ERROR nova.conductor.manager [None req-bbd195fa-4b75-45ae-8488-5a968d26f87a tempest-ImagesTestJSON-939914451 tempest-ImagesTestJSON-939914451-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 773.130097] nova-conductor[52435]: Traceback (most recent call last):
[ 773.130097] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 773.130097] nova-conductor[52435]: return func(*args, **kwargs)
[ 773.130097] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 773.130097] nova-conductor[52435]: selections = self._select_destinations(
[ 773.130097] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 773.130097] nova-conductor[52435]: selections = self._schedule(
[ 773.130097] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 773.130097] nova-conductor[52435]: self._ensure_sufficient_hosts(
[ 773.130097] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 773.130097] nova-conductor[52435]: raise exception.NoValidHost(reason=reason)
[ 773.130097] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 773.130097] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 773.130097] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 773.130097] nova-conductor[52435]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0],
[ 773.130097] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 773.130097] nova-conductor[52435]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations(
[ 773.130097] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 773.130097] nova-conductor[52435]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 773.130097] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 773.130097] nova-conductor[52435]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 773.130097] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 773.130097] nova-conductor[52435]: ERROR nova.conductor.manager result = self.transport._send(
[ 773.130097] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 773.130097] nova-conductor[52435]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message,
[ 773.130097] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 773.130097] nova-conductor[52435]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 773.130097] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 773.130097] nova-conductor[52435]: ERROR nova.conductor.manager raise result
[ 773.130097] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 773.130097] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 773.130097] nova-conductor[52435]: ERROR nova.conductor.manager
[ 773.130097] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 773.130097] nova-conductor[52435]: ERROR nova.conductor.manager return func(*args, **kwargs)
[ 773.130097] nova-conductor[52435]: ERROR nova.conductor.manager
[ 773.130097] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 773.130097] nova-conductor[52435]: ERROR nova.conductor.manager selections = self._select_destinations(
[ 773.130097] nova-conductor[52435]: ERROR nova.conductor.manager
[ 773.130097] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 773.130097] nova-conductor[52435]: ERROR nova.conductor.manager selections = self._schedule(
[ 773.130097] nova-conductor[52435]: ERROR nova.conductor.manager
[ 773.130097] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 773.130097] nova-conductor[52435]: ERROR nova.conductor.manager self._ensure_sufficient_hosts(
[ 773.130097] nova-conductor[52435]: ERROR nova.conductor.manager
[ 773.130097] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 773.130097] nova-conductor[52435]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason)
[ 773.130097] nova-conductor[52435]: ERROR nova.conductor.manager
[ 773.130097] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 773.130097] nova-conductor[52435]: ERROR nova.conductor.manager
[ 773.130097] nova-conductor[52435]: ERROR nova.conductor.manager
[ 773.146781] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-bbd195fa-4b75-45ae-8488-5a968d26f87a tempest-ImagesTestJSON-939914451 tempest-ImagesTestJSON-939914451-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 773.147095] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-bbd195fa-4b75-45ae-8488-5a968d26f87a tempest-ImagesTestJSON-939914451 tempest-ImagesTestJSON-939914451-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 773.147644] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-bbd195fa-4b75-45ae-8488-5a968d26f87a tempest-ImagesTestJSON-939914451 tempest-ImagesTestJSON-939914451-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 773.211537] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-bbd195fa-4b75-45ae-8488-5a968d26f87a tempest-ImagesTestJSON-939914451 tempest-ImagesTestJSON-939914451-project-member] [instance: c17b4bf5-b954-410e-b2e6-0d348b0e3953] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 773.212268] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-bbd195fa-4b75-45ae-8488-5a968d26f87a tempest-ImagesTestJSON-939914451 tempest-ImagesTestJSON-939914451-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 773.212470] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-bbd195fa-4b75-45ae-8488-5a968d26f87a tempest-ImagesTestJSON-939914451 tempest-ImagesTestJSON-939914451-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 773.212855] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-bbd195fa-4b75-45ae-8488-5a968d26f87a tempest-ImagesTestJSON-939914451 tempest-ImagesTestJSON-939914451-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 773.216592] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-bbd195fa-4b75-45ae-8488-5a968d26f87a tempest-ImagesTestJSON-939914451 tempest-ImagesTestJSON-939914451-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 773.216592] nova-conductor[52435]: Traceback (most recent call last):
[ 773.216592] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 773.216592] nova-conductor[52435]: return func(*args, **kwargs)
[ 773.216592] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 773.216592] nova-conductor[52435]: selections = self._select_destinations(
[ 773.216592] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 773.216592] nova-conductor[52435]: selections = self._schedule(
[ 773.216592] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 773.216592] nova-conductor[52435]: self._ensure_sufficient_hosts(
[ 773.216592] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 773.216592] nova-conductor[52435]: raise exception.NoValidHost(reason=reason)
[ 773.216592] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 773.216592] nova-conductor[52435]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 773.217153] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-bbd195fa-4b75-45ae-8488-5a968d26f87a tempest-ImagesTestJSON-939914451 tempest-ImagesTestJSON-939914451-project-member] [instance: c17b4bf5-b954-410e-b2e6-0d348b0e3953] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 774.684978] nova-conductor[52436]: ERROR nova.conductor.manager [None req-6aa78579-554e-4c0e-97a5-7017c80c244c tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 774.684978] nova-conductor[52436]: Traceback (most recent call last): [ 774.684978] nova-conductor[52436]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 774.684978] nova-conductor[52436]: return func(*args, **kwargs) [ 774.684978] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 774.684978] nova-conductor[52436]: selections = self._select_destinations( [ 774.684978] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 774.684978] nova-conductor[52436]: selections = self._schedule( [ 774.684978] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 774.684978] nova-conductor[52436]: self._ensure_sufficient_hosts( [ 774.684978] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 774.684978] nova-conductor[52436]: raise exception.NoValidHost(reason=reason) [ 774.684978] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 774.684978] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last): [ 774.684978] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 774.684978] nova-conductor[52436]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 774.684978] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 774.684978] nova-conductor[52436]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 774.684978] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 774.684978] nova-conductor[52436]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 774.684978] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 774.684978] nova-conductor[52436]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 774.684978] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 774.684978] nova-conductor[52436]: ERROR nova.conductor.manager result = self.transport._send( [ 774.684978] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 774.684978] nova-conductor[52436]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 774.684978] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 774.684978] nova-conductor[52436]: ERROR nova.conductor.manager return 
self._send(target, ctxt, message, wait_for_reply, timeout, [ 774.684978] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 774.684978] nova-conductor[52436]: ERROR nova.conductor.manager raise result [ 774.684978] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 774.684978] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last): [ 774.684978] nova-conductor[52436]: ERROR nova.conductor.manager [ 774.684978] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 774.684978] nova-conductor[52436]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 774.684978] nova-conductor[52436]: ERROR nova.conductor.manager [ 774.684978] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 774.684978] nova-conductor[52436]: ERROR nova.conductor.manager selections = self._select_destinations( [ 774.684978] nova-conductor[52436]: ERROR nova.conductor.manager [ 774.684978] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 774.684978] nova-conductor[52436]: ERROR nova.conductor.manager selections = self._schedule( [ 774.684978] nova-conductor[52436]: ERROR nova.conductor.manager [ 774.684978] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 774.684978] nova-conductor[52436]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 774.684978] nova-conductor[52436]: ERROR nova.conductor.manager [ 774.684978] nova-conductor[52436]: ERROR nova.conductor.manager File 
"/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 774.684978] nova-conductor[52436]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 774.684978] nova-conductor[52436]: ERROR nova.conductor.manager [ 774.684978] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 774.684978] nova-conductor[52436]: ERROR nova.conductor.manager [ 774.684978] nova-conductor[52436]: ERROR nova.conductor.manager [ 774.691919] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-6aa78579-554e-4c0e-97a5-7017c80c244c tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 774.692012] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-6aa78579-554e-4c0e-97a5-7017c80c244c tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 774.692183] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-6aa78579-554e-4c0e-97a5-7017c80c244c tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 774.755527] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-6aa78579-554e-4c0e-97a5-7017c80c244c 
tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] [instance: a1a7b595-20fd-43d8-bff2-286cecd1a83e] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 774.756299] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-6aa78579-554e-4c0e-97a5-7017c80c244c tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 774.756572] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-6aa78579-554e-4c0e-97a5-7017c80c244c tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 774.756851] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-6aa78579-554e-4c0e-97a5-7017c80c244c tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s 
{{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 774.763276] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-6aa78579-554e-4c0e-97a5-7017c80c244c tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 774.763276] nova-conductor[52436]: Traceback (most recent call last): [ 774.763276] nova-conductor[52436]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 774.763276] nova-conductor[52436]: return func(*args, **kwargs) [ 774.763276] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 774.763276] nova-conductor[52436]: selections = self._select_destinations( [ 774.763276] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 774.763276] nova-conductor[52436]: selections = self._schedule( [ 774.763276] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 774.763276] nova-conductor[52436]: self._ensure_sufficient_hosts( [ 774.763276] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 774.763276] nova-conductor[52436]: raise exception.NoValidHost(reason=reason) [ 774.763276] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 774.763276] nova-conductor[52436]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 774.765037] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-6aa78579-554e-4c0e-97a5-7017c80c244c tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] [instance: a1a7b595-20fd-43d8-bff2-286cecd1a83e] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 775.502404] nova-conductor[52435]: ERROR nova.conductor.manager [None req-dcb95ac3-8864-4982-9473-5af261f5f61a tempest-MultipleCreateTestJSON-285117607 tempest-MultipleCreateTestJSON-285117607-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 775.502404] nova-conductor[52435]: Traceback (most recent call last): [ 775.502404] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 775.502404] nova-conductor[52435]: return func(*args, **kwargs) [ 775.502404] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 775.502404] nova-conductor[52435]: selections = self._select_destinations( [ 775.502404] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 775.502404] nova-conductor[52435]: selections = self._schedule( [ 775.502404] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 775.502404] nova-conductor[52435]: self._ensure_sufficient_hosts( [ 775.502404] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 775.502404] nova-conductor[52435]: raise exception.NoValidHost(reason=reason) [ 775.502404] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 775.502404] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last): [ 775.502404] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 775.502404] nova-conductor[52435]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 775.502404] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 775.502404] nova-conductor[52435]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 775.502404] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 775.502404] nova-conductor[52435]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 775.502404] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 775.502404] nova-conductor[52435]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 775.502404] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 775.502404] nova-conductor[52435]: ERROR nova.conductor.manager result = self.transport._send( [ 775.502404] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 775.502404] nova-conductor[52435]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 775.502404] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 775.502404] nova-conductor[52435]: ERROR nova.conductor.manager return 
self._send(target, ctxt, message, wait_for_reply, timeout, [ 775.502404] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 775.502404] nova-conductor[52435]: ERROR nova.conductor.manager raise result [ 775.502404] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 775.502404] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last): [ 775.502404] nova-conductor[52435]: ERROR nova.conductor.manager [ 775.502404] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 775.502404] nova-conductor[52435]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 775.502404] nova-conductor[52435]: ERROR nova.conductor.manager [ 775.502404] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 775.502404] nova-conductor[52435]: ERROR nova.conductor.manager selections = self._select_destinations( [ 775.502404] nova-conductor[52435]: ERROR nova.conductor.manager [ 775.502404] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 775.502404] nova-conductor[52435]: ERROR nova.conductor.manager selections = self._schedule( [ 775.502404] nova-conductor[52435]: ERROR nova.conductor.manager [ 775.502404] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 775.502404] nova-conductor[52435]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 775.502404] nova-conductor[52435]: ERROR nova.conductor.manager [ 775.502404] nova-conductor[52435]: ERROR nova.conductor.manager File 
"/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 775.502404] nova-conductor[52435]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 775.502404] nova-conductor[52435]: ERROR nova.conductor.manager [ 775.502404] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 775.502404] nova-conductor[52435]: ERROR nova.conductor.manager [ 775.502404] nova-conductor[52435]: ERROR nova.conductor.manager [ 775.515375] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-dcb95ac3-8864-4982-9473-5af261f5f61a tempest-MultipleCreateTestJSON-285117607 tempest-MultipleCreateTestJSON-285117607-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 775.515759] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-dcb95ac3-8864-4982-9473-5af261f5f61a tempest-MultipleCreateTestJSON-285117607 tempest-MultipleCreateTestJSON-285117607-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 775.516066] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-dcb95ac3-8864-4982-9473-5af261f5f61a tempest-MultipleCreateTestJSON-285117607 tempest-MultipleCreateTestJSON-285117607-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 775.579806] nova-conductor[52435]: DEBUG nova.conductor.manager [None 
req-dcb95ac3-8864-4982-9473-5af261f5f61a tempest-MultipleCreateTestJSON-285117607 tempest-MultipleCreateTestJSON-285117607-project-member] [instance: 59c2fc7d-69ef-470a-890b-f3b6a34be63e] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 775.580606] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-dcb95ac3-8864-4982-9473-5af261f5f61a tempest-MultipleCreateTestJSON-285117607 tempest-MultipleCreateTestJSON-285117607-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 775.580823] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-dcb95ac3-8864-4982-9473-5af261f5f61a tempest-MultipleCreateTestJSON-285117607 tempest-MultipleCreateTestJSON-285117607-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 775.580993] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-dcb95ac3-8864-4982-9473-5af261f5f61a tempest-MultipleCreateTestJSON-285117607 tempest-MultipleCreateTestJSON-285117607-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 775.585219] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-dcb95ac3-8864-4982-9473-5af261f5f61a tempest-MultipleCreateTestJSON-285117607 tempest-MultipleCreateTestJSON-285117607-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 775.585219] nova-conductor[52435]: Traceback (most recent call last): [ 775.585219] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 775.585219] nova-conductor[52435]: return func(*args, **kwargs) [ 775.585219] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 775.585219] nova-conductor[52435]: selections = self._select_destinations( [ 775.585219] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 775.585219] nova-conductor[52435]: selections = self._schedule( [ 775.585219] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 775.585219] nova-conductor[52435]: self._ensure_sufficient_hosts( [ 775.585219] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 775.585219] nova-conductor[52435]: raise exception.NoValidHost(reason=reason) [ 775.585219] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 775.585219] nova-conductor[52435]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 775.585820] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-dcb95ac3-8864-4982-9473-5af261f5f61a tempest-MultipleCreateTestJSON-285117607 tempest-MultipleCreateTestJSON-285117607-project-member] [instance: 59c2fc7d-69ef-470a-890b-f3b6a34be63e] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 775.616113] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-dcb95ac3-8864-4982-9473-5af261f5f61a tempest-MultipleCreateTestJSON-285117607 tempest-MultipleCreateTestJSON-285117607-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 775.616839] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-dcb95ac3-8864-4982-9473-5af261f5f61a tempest-MultipleCreateTestJSON-285117607 tempest-MultipleCreateTestJSON-285117607-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 775.617083] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-dcb95ac3-8864-4982-9473-5af261f5f61a tempest-MultipleCreateTestJSON-285117607 tempest-MultipleCreateTestJSON-285117607-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 775.677734] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-dcb95ac3-8864-4982-9473-5af261f5f61a tempest-MultipleCreateTestJSON-285117607 tempest-MultipleCreateTestJSON-285117607-project-member] [instance: 
6f18732c-44b2-46af-ad1e-b6c88c8c964e] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 775.677734] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-dcb95ac3-8864-4982-9473-5af261f5f61a tempest-MultipleCreateTestJSON-285117607 tempest-MultipleCreateTestJSON-285117607-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 775.677734] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-dcb95ac3-8864-4982-9473-5af261f5f61a tempest-MultipleCreateTestJSON-285117607 tempest-MultipleCreateTestJSON-285117607-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 775.678131] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-dcb95ac3-8864-4982-9473-5af261f5f61a tempest-MultipleCreateTestJSON-285117607 tempest-MultipleCreateTestJSON-285117607-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 775.681306] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-dcb95ac3-8864-4982-9473-5af261f5f61a tempest-MultipleCreateTestJSON-285117607 tempest-MultipleCreateTestJSON-285117607-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 775.681306] nova-conductor[52435]: Traceback (most recent call last): [ 775.681306] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 775.681306] nova-conductor[52435]: return func(*args, **kwargs) [ 775.681306] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 775.681306] nova-conductor[52435]: selections = self._select_destinations( [ 775.681306] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 775.681306] nova-conductor[52435]: selections = self._schedule( [ 775.681306] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 775.681306] nova-conductor[52435]: self._ensure_sufficient_hosts( [ 775.681306] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 775.681306] nova-conductor[52435]: raise exception.NoValidHost(reason=reason) [ 775.681306] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 775.681306] nova-conductor[52435]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 775.681822] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-dcb95ac3-8864-4982-9473-5af261f5f61a tempest-MultipleCreateTestJSON-285117607 tempest-MultipleCreateTestJSON-285117607-project-member] [instance: 6f18732c-44b2-46af-ad1e-b6c88c8c964e] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 776.321372] nova-conductor[52436]: ERROR nova.conductor.manager [None req-ae75a83b-a5f6-47db-874a-5ee589dcf3d7 tempest-ServerAddressesTestJSON-1433732937 tempest-ServerAddressesTestJSON-1433732937-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 776.321372] nova-conductor[52436]: Traceback (most recent call last): [ 776.321372] nova-conductor[52436]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 776.321372] nova-conductor[52436]: return func(*args, **kwargs) [ 776.321372] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 776.321372] nova-conductor[52436]: selections = self._select_destinations( [ 776.321372] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 776.321372] nova-conductor[52436]: selections = self._schedule( [ 776.321372] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 776.321372] nova-conductor[52436]: self._ensure_sufficient_hosts( [ 776.321372] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 776.321372] nova-conductor[52436]: raise exception.NoValidHost(reason=reason) [ 776.321372] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 776.321372] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last): [ 776.321372] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 776.321372] nova-conductor[52436]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 776.321372] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 776.321372] nova-conductor[52436]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 776.321372] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 776.321372] nova-conductor[52436]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 776.321372] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 776.321372] nova-conductor[52436]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 776.321372] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 776.321372] nova-conductor[52436]: ERROR nova.conductor.manager result = self.transport._send( [ 776.321372] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 776.321372] nova-conductor[52436]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 776.321372] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 776.321372] nova-conductor[52436]: ERROR nova.conductor.manager return 
self._send(target, ctxt, message, wait_for_reply, timeout, [ 776.321372] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 776.321372] nova-conductor[52436]: ERROR nova.conductor.manager raise result [ 776.321372] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 776.321372] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last): [ 776.321372] nova-conductor[52436]: ERROR nova.conductor.manager [ 776.321372] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 776.321372] nova-conductor[52436]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 776.321372] nova-conductor[52436]: ERROR nova.conductor.manager [ 776.321372] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 776.321372] nova-conductor[52436]: ERROR nova.conductor.manager selections = self._select_destinations( [ 776.321372] nova-conductor[52436]: ERROR nova.conductor.manager [ 776.321372] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 776.321372] nova-conductor[52436]: ERROR nova.conductor.manager selections = self._schedule( [ 776.321372] nova-conductor[52436]: ERROR nova.conductor.manager [ 776.321372] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 776.321372] nova-conductor[52436]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 776.321372] nova-conductor[52436]: ERROR nova.conductor.manager [ 776.321372] nova-conductor[52436]: ERROR nova.conductor.manager File 
"/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 776.321372] nova-conductor[52436]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 776.321372] nova-conductor[52436]: ERROR nova.conductor.manager [ 776.321372] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 776.321372] nova-conductor[52436]: ERROR nova.conductor.manager [ 776.321372] nova-conductor[52436]: ERROR nova.conductor.manager [ 776.329181] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-ae75a83b-a5f6-47db-874a-5ee589dcf3d7 tempest-ServerAddressesTestJSON-1433732937 tempest-ServerAddressesTestJSON-1433732937-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 776.329432] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-ae75a83b-a5f6-47db-874a-5ee589dcf3d7 tempest-ServerAddressesTestJSON-1433732937 tempest-ServerAddressesTestJSON-1433732937-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 776.329561] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-ae75a83b-a5f6-47db-874a-5ee589dcf3d7 tempest-ServerAddressesTestJSON-1433732937 tempest-ServerAddressesTestJSON-1433732937-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 776.555927] nova-conductor[52436]: DEBUG nova.conductor.manager [None 
req-ae75a83b-a5f6-47db-874a-5ee589dcf3d7 tempest-ServerAddressesTestJSON-1433732937 tempest-ServerAddressesTestJSON-1433732937-project-member] [instance: 6c957aea-91da-4b60-9903-17427c14ab47] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 776.555927] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-ae75a83b-a5f6-47db-874a-5ee589dcf3d7 tempest-ServerAddressesTestJSON-1433732937 tempest-ServerAddressesTestJSON-1433732937-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 776.555927] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-ae75a83b-a5f6-47db-874a-5ee589dcf3d7 tempest-ServerAddressesTestJSON-1433732937 tempest-ServerAddressesTestJSON-1433732937-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 776.555927] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-ae75a83b-a5f6-47db-874a-5ee589dcf3d7 tempest-ServerAddressesTestJSON-1433732937 tempest-ServerAddressesTestJSON-1433732937-project-member] Lock "00000000-0000-0000-0000-000000000000" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 776.558658] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-ae75a83b-a5f6-47db-874a-5ee589dcf3d7 tempest-ServerAddressesTestJSON-1433732937 tempest-ServerAddressesTestJSON-1433732937-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 776.558658] nova-conductor[52436]: Traceback (most recent call last): [ 776.558658] nova-conductor[52436]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 776.558658] nova-conductor[52436]: return func(*args, **kwargs) [ 776.558658] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 776.558658] nova-conductor[52436]: selections = self._select_destinations( [ 776.558658] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 776.558658] nova-conductor[52436]: selections = self._schedule( [ 776.558658] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 776.558658] nova-conductor[52436]: self._ensure_sufficient_hosts( [ 776.558658] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 776.558658] nova-conductor[52436]: raise exception.NoValidHost(reason=reason) [ 776.558658] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 776.558658] nova-conductor[52436]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 776.559170] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-ae75a83b-a5f6-47db-874a-5ee589dcf3d7 tempest-ServerAddressesTestJSON-1433732937 tempest-ServerAddressesTestJSON-1433732937-project-member] [instance: 6c957aea-91da-4b60-9903-17427c14ab47] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 776.664103] nova-conductor[52435]: ERROR nova.conductor.manager [None req-a0e98da1-83d5-4d1c-b47d-63d0611b765c tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 776.664103] nova-conductor[52435]: Traceback (most recent call last): [ 776.664103] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 776.664103] nova-conductor[52435]: return func(*args, **kwargs) [ 776.664103] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 776.664103] nova-conductor[52435]: selections = self._select_destinations( [ 776.664103] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 776.664103] nova-conductor[52435]: selections = self._schedule( [ 776.664103] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 776.664103] nova-conductor[52435]: self._ensure_sufficient_hosts( [ 776.664103] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 776.664103] nova-conductor[52435]: raise exception.NoValidHost(reason=reason) [ 776.664103] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 776.664103] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last): [ 776.664103] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 776.664103] nova-conductor[52435]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 776.664103] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 776.664103] nova-conductor[52435]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 776.664103] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 776.664103] nova-conductor[52435]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 776.664103] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 776.664103] nova-conductor[52435]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 776.664103] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 776.664103] nova-conductor[52435]: ERROR nova.conductor.manager result = self.transport._send( [ 776.664103] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 776.664103] nova-conductor[52435]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 776.664103] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 776.664103] nova-conductor[52435]: ERROR nova.conductor.manager return 
self._send(target, ctxt, message, wait_for_reply, timeout, [ 776.664103] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 776.664103] nova-conductor[52435]: ERROR nova.conductor.manager raise result [ 776.664103] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 776.664103] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last): [ 776.664103] nova-conductor[52435]: ERROR nova.conductor.manager [ 776.664103] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 776.664103] nova-conductor[52435]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 776.664103] nova-conductor[52435]: ERROR nova.conductor.manager [ 776.664103] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 776.664103] nova-conductor[52435]: ERROR nova.conductor.manager selections = self._select_destinations( [ 776.664103] nova-conductor[52435]: ERROR nova.conductor.manager [ 776.664103] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 776.664103] nova-conductor[52435]: ERROR nova.conductor.manager selections = self._schedule( [ 776.664103] nova-conductor[52435]: ERROR nova.conductor.manager [ 776.664103] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 776.664103] nova-conductor[52435]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 776.664103] nova-conductor[52435]: ERROR nova.conductor.manager [ 776.664103] nova-conductor[52435]: ERROR nova.conductor.manager File 
"/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 776.664103] nova-conductor[52435]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 776.664103] nova-conductor[52435]: ERROR nova.conductor.manager [ 776.664103] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 776.664103] nova-conductor[52435]: ERROR nova.conductor.manager [ 776.664103] nova-conductor[52435]: ERROR nova.conductor.manager [ 776.671990] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-a0e98da1-83d5-4d1c-b47d-63d0611b765c tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 776.672227] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-a0e98da1-83d5-4d1c-b47d-63d0611b765c tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 776.672393] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-a0e98da1-83d5-4d1c-b47d-63d0611b765c tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 776.732300] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-a0e98da1-83d5-4d1c-b47d-63d0611b765c 
tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] [instance: 6277a04d-f0df-4d1a-bf41-a181ce69aa84] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 776.733034] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-a0e98da1-83d5-4d1c-b47d-63d0611b765c tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 776.733259] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-a0e98da1-83d5-4d1c-b47d-63d0611b765c tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 776.733466] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-a0e98da1-83d5-4d1c-b47d-63d0611b765c tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s 
{{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 776.741104] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-a0e98da1-83d5-4d1c-b47d-63d0611b765c tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 776.741104] nova-conductor[52435]: Traceback (most recent call last): [ 776.741104] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 776.741104] nova-conductor[52435]: return func(*args, **kwargs) [ 776.741104] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 776.741104] nova-conductor[52435]: selections = self._select_destinations( [ 776.741104] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 776.741104] nova-conductor[52435]: selections = self._schedule( [ 776.741104] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 776.741104] nova-conductor[52435]: self._ensure_sufficient_hosts( [ 776.741104] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 776.741104] nova-conductor[52435]: raise exception.NoValidHost(reason=reason) [ 776.741104] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 776.741104] nova-conductor[52435]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 776.744325] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-a0e98da1-83d5-4d1c-b47d-63d0611b765c tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] [instance: 6277a04d-f0df-4d1a-bf41-a181ce69aa84] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 778.167215] nova-conductor[52436]: ERROR nova.conductor.manager [None req-9a3ab43e-0af9-40de-8c7e-a6330e91050b tempest-MultipleCreateTestJSON-285117607 tempest-MultipleCreateTestJSON-285117607-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 778.167215] nova-conductor[52436]: Traceback (most recent call last): [ 778.167215] nova-conductor[52436]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 778.167215] nova-conductor[52436]: return func(*args, **kwargs) [ 778.167215] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 778.167215] nova-conductor[52436]: selections = self._select_destinations( [ 778.167215] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 778.167215] nova-conductor[52436]: selections = self._schedule( [ 778.167215] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 778.167215] nova-conductor[52436]: self._ensure_sufficient_hosts( [ 778.167215] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 778.167215] nova-conductor[52436]: raise exception.NoValidHost(reason=reason) [ 778.167215] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 778.167215] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last): [ 778.167215] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 778.167215] nova-conductor[52436]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 778.167215] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 778.167215] nova-conductor[52436]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 778.167215] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 778.167215] nova-conductor[52436]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 778.167215] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 778.167215] nova-conductor[52436]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 778.167215] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 778.167215] nova-conductor[52436]: ERROR nova.conductor.manager result = self.transport._send( [ 778.167215] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 778.167215] nova-conductor[52436]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 778.167215] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 778.167215] nova-conductor[52436]: ERROR nova.conductor.manager return 
self._send(target, ctxt, message, wait_for_reply, timeout, [ 778.167215] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 778.167215] nova-conductor[52436]: ERROR nova.conductor.manager raise result [ 778.167215] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 778.167215] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last): [ 778.167215] nova-conductor[52436]: ERROR nova.conductor.manager [ 778.167215] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 778.167215] nova-conductor[52436]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 778.167215] nova-conductor[52436]: ERROR nova.conductor.manager [ 778.167215] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 778.167215] nova-conductor[52436]: ERROR nova.conductor.manager selections = self._select_destinations( [ 778.167215] nova-conductor[52436]: ERROR nova.conductor.manager [ 778.167215] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 778.167215] nova-conductor[52436]: ERROR nova.conductor.manager selections = self._schedule( [ 778.167215] nova-conductor[52436]: ERROR nova.conductor.manager [ 778.167215] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 778.167215] nova-conductor[52436]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 778.167215] nova-conductor[52436]: ERROR nova.conductor.manager [ 778.167215] nova-conductor[52436]: ERROR nova.conductor.manager File 
"/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 778.167215] nova-conductor[52436]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 778.167215] nova-conductor[52436]: ERROR nova.conductor.manager [ 778.167215] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 778.167215] nova-conductor[52436]: ERROR nova.conductor.manager [ 778.167215] nova-conductor[52436]: ERROR nova.conductor.manager [ 778.175162] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-9a3ab43e-0af9-40de-8c7e-a6330e91050b tempest-MultipleCreateTestJSON-285117607 tempest-MultipleCreateTestJSON-285117607-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 778.175395] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-9a3ab43e-0af9-40de-8c7e-a6330e91050b tempest-MultipleCreateTestJSON-285117607 tempest-MultipleCreateTestJSON-285117607-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 778.175559] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-9a3ab43e-0af9-40de-8c7e-a6330e91050b tempest-MultipleCreateTestJSON-285117607 tempest-MultipleCreateTestJSON-285117607-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 778.226765] nova-conductor[52436]: DEBUG nova.conductor.manager [None 
req-9a3ab43e-0af9-40de-8c7e-a6330e91050b tempest-MultipleCreateTestJSON-285117607 tempest-MultipleCreateTestJSON-285117607-project-member] [instance: 8a28beac-40e2-4384-a3fb-7a01a1d54af4] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 778.227532] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-9a3ab43e-0af9-40de-8c7e-a6330e91050b tempest-MultipleCreateTestJSON-285117607 tempest-MultipleCreateTestJSON-285117607-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 778.227780] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-9a3ab43e-0af9-40de-8c7e-a6330e91050b tempest-MultipleCreateTestJSON-285117607 tempest-MultipleCreateTestJSON-285117607-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 778.227911] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-9a3ab43e-0af9-40de-8c7e-a6330e91050b tempest-MultipleCreateTestJSON-285117607 tempest-MultipleCreateTestJSON-285117607-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 778.231961] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-9a3ab43e-0af9-40de-8c7e-a6330e91050b tempest-MultipleCreateTestJSON-285117607 tempest-MultipleCreateTestJSON-285117607-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 778.231961] nova-conductor[52436]: Traceback (most recent call last): [ 778.231961] nova-conductor[52436]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 778.231961] nova-conductor[52436]: return func(*args, **kwargs) [ 778.231961] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 778.231961] nova-conductor[52436]: selections = self._select_destinations( [ 778.231961] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 778.231961] nova-conductor[52436]: selections = self._schedule( [ 778.231961] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 778.231961] nova-conductor[52436]: self._ensure_sufficient_hosts( [ 778.231961] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 778.231961] nova-conductor[52436]: raise exception.NoValidHost(reason=reason) [ 778.231961] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 778.231961] nova-conductor[52436]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 778.231961] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-9a3ab43e-0af9-40de-8c7e-a6330e91050b tempest-MultipleCreateTestJSON-285117607 tempest-MultipleCreateTestJSON-285117607-project-member] [instance: 8a28beac-40e2-4384-a3fb-7a01a1d54af4] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 778.255861] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-9a3ab43e-0af9-40de-8c7e-a6330e91050b tempest-MultipleCreateTestJSON-285117607 tempest-MultipleCreateTestJSON-285117607-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 778.256510] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-9a3ab43e-0af9-40de-8c7e-a6330e91050b tempest-MultipleCreateTestJSON-285117607 tempest-MultipleCreateTestJSON-285117607-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 778.256721] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-9a3ab43e-0af9-40de-8c7e-a6330e91050b tempest-MultipleCreateTestJSON-285117607 tempest-MultipleCreateTestJSON-285117607-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 778.299238] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-9a3ab43e-0af9-40de-8c7e-a6330e91050b tempest-MultipleCreateTestJSON-285117607 tempest-MultipleCreateTestJSON-285117607-project-member] [instance: 
b290344b-0ad7-47d3-9aa0-a6467ae85397] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 778.300008] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-9a3ab43e-0af9-40de-8c7e-a6330e91050b tempest-MultipleCreateTestJSON-285117607 tempest-MultipleCreateTestJSON-285117607-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 778.300238] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-9a3ab43e-0af9-40de-8c7e-a6330e91050b tempest-MultipleCreateTestJSON-285117607 tempest-MultipleCreateTestJSON-285117607-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 778.300408] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-9a3ab43e-0af9-40de-8c7e-a6330e91050b tempest-MultipleCreateTestJSON-285117607 tempest-MultipleCreateTestJSON-285117607-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 778.304247] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-9a3ab43e-0af9-40de-8c7e-a6330e91050b tempest-MultipleCreateTestJSON-285117607 tempest-MultipleCreateTestJSON-285117607-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 778.304247] nova-conductor[52436]: Traceback (most recent call last): [ 778.304247] nova-conductor[52436]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 778.304247] nova-conductor[52436]: return func(*args, **kwargs) [ 778.304247] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 778.304247] nova-conductor[52436]: selections = self._select_destinations( [ 778.304247] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 778.304247] nova-conductor[52436]: selections = self._schedule( [ 778.304247] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 778.304247] nova-conductor[52436]: self._ensure_sufficient_hosts( [ 778.304247] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 778.304247] nova-conductor[52436]: raise exception.NoValidHost(reason=reason) [ 778.304247] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 778.304247] nova-conductor[52436]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 778.304936] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-9a3ab43e-0af9-40de-8c7e-a6330e91050b tempest-MultipleCreateTestJSON-285117607 tempest-MultipleCreateTestJSON-285117607-project-member] [instance: b290344b-0ad7-47d3-9aa0-a6467ae85397] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 778.769145] nova-conductor[52435]: ERROR nova.conductor.manager [None req-8fcb91b7-ee58-4909-9bb8-7cfee0000c4a tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 778.769145] nova-conductor[52435]: Traceback (most recent call last): [ 778.769145] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 778.769145] nova-conductor[52435]: return func(*args, **kwargs) [ 778.769145] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 778.769145] nova-conductor[52435]: selections = self._select_destinations( [ 778.769145] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 778.769145] nova-conductor[52435]: selections = self._schedule( [ 778.769145] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 778.769145] nova-conductor[52435]: self._ensure_sufficient_hosts( [ 778.769145] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 778.769145] nova-conductor[52435]: raise exception.NoValidHost(reason=reason) [ 778.769145] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 778.769145] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last): [ 778.769145] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 778.769145] nova-conductor[52435]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 778.769145] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 778.769145] nova-conductor[52435]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 778.769145] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 778.769145] nova-conductor[52435]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 778.769145] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 778.769145] nova-conductor[52435]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 778.769145] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 778.769145] nova-conductor[52435]: ERROR nova.conductor.manager result = self.transport._send( [ 778.769145] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 778.769145] nova-conductor[52435]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 778.769145] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 778.769145] nova-conductor[52435]: ERROR nova.conductor.manager return 
self._send(target, ctxt, message, wait_for_reply, timeout, [ 778.769145] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 778.769145] nova-conductor[52435]: ERROR nova.conductor.manager raise result [ 778.769145] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 778.769145] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last): [ 778.769145] nova-conductor[52435]: ERROR nova.conductor.manager [ 778.769145] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 778.769145] nova-conductor[52435]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 778.769145] nova-conductor[52435]: ERROR nova.conductor.manager [ 778.769145] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 778.769145] nova-conductor[52435]: ERROR nova.conductor.manager selections = self._select_destinations( [ 778.769145] nova-conductor[52435]: ERROR nova.conductor.manager [ 778.769145] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 778.769145] nova-conductor[52435]: ERROR nova.conductor.manager selections = self._schedule( [ 778.769145] nova-conductor[52435]: ERROR nova.conductor.manager [ 778.769145] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 778.769145] nova-conductor[52435]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 778.769145] nova-conductor[52435]: ERROR nova.conductor.manager [ 778.769145] nova-conductor[52435]: ERROR nova.conductor.manager File 
"/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 778.769145] nova-conductor[52435]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 778.769145] nova-conductor[52435]: ERROR nova.conductor.manager [ 778.769145] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 778.769145] nova-conductor[52435]: ERROR nova.conductor.manager [ 778.769145] nova-conductor[52435]: ERROR nova.conductor.manager [ 778.783861] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-8fcb91b7-ee58-4909-9bb8-7cfee0000c4a tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 778.784088] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-8fcb91b7-ee58-4909-9bb8-7cfee0000c4a tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 778.784558] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-8fcb91b7-ee58-4909-9bb8-7cfee0000c4a tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 778.806037] nova-conductor[52436]: ERROR nova.conductor.manager [None req-f0028133-1f87-4714-a85b-92702cda1420 
tempest-ListServerFiltersTestJSON-1383070416 tempest-ListServerFiltersTestJSON-1383070416-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 778.806037] nova-conductor[52436]: Traceback (most recent call last): [ 778.806037] nova-conductor[52436]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 778.806037] nova-conductor[52436]: return func(*args, **kwargs) [ 778.806037] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 778.806037] nova-conductor[52436]: selections = self._select_destinations( [ 778.806037] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 778.806037] nova-conductor[52436]: selections = self._schedule( [ 778.806037] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 778.806037] nova-conductor[52436]: self._ensure_sufficient_hosts( [ 778.806037] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 778.806037] nova-conductor[52436]: raise exception.NoValidHost(reason=reason) [ 778.806037] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 778.806037] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last): [ 778.806037] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 778.806037] nova-conductor[52436]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 778.806037] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 778.806037] nova-conductor[52436]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 778.806037] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 778.806037] nova-conductor[52436]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 778.806037] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 778.806037] nova-conductor[52436]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 778.806037] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 778.806037] nova-conductor[52436]: ERROR nova.conductor.manager result = self.transport._send( [ 778.806037] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 778.806037] nova-conductor[52436]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 778.806037] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 778.806037] nova-conductor[52436]: ERROR nova.conductor.manager return 
self._send(target, ctxt, message, wait_for_reply, timeout, [ 778.806037] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 778.806037] nova-conductor[52436]: ERROR nova.conductor.manager raise result [ 778.806037] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 778.806037] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last): [ 778.806037] nova-conductor[52436]: ERROR nova.conductor.manager [ 778.806037] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 778.806037] nova-conductor[52436]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 778.806037] nova-conductor[52436]: ERROR nova.conductor.manager [ 778.806037] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 778.806037] nova-conductor[52436]: ERROR nova.conductor.manager selections = self._select_destinations( [ 778.806037] nova-conductor[52436]: ERROR nova.conductor.manager [ 778.806037] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 778.806037] nova-conductor[52436]: ERROR nova.conductor.manager selections = self._schedule( [ 778.806037] nova-conductor[52436]: ERROR nova.conductor.manager [ 778.806037] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 778.806037] nova-conductor[52436]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 778.806037] nova-conductor[52436]: ERROR nova.conductor.manager [ 778.806037] nova-conductor[52436]: ERROR nova.conductor.manager File 
"/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 778.806037] nova-conductor[52436]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 778.806037] nova-conductor[52436]: ERROR nova.conductor.manager [ 778.806037] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 778.806037] nova-conductor[52436]: ERROR nova.conductor.manager [ 778.806037] nova-conductor[52436]: ERROR nova.conductor.manager [ 778.816453] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f0028133-1f87-4714-a85b-92702cda1420 tempest-ListServerFiltersTestJSON-1383070416 tempest-ListServerFiltersTestJSON-1383070416-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 778.816682] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f0028133-1f87-4714-a85b-92702cda1420 tempest-ListServerFiltersTestJSON-1383070416 tempest-ListServerFiltersTestJSON-1383070416-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 778.816924] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f0028133-1f87-4714-a85b-92702cda1420 tempest-ListServerFiltersTestJSON-1383070416 tempest-ListServerFiltersTestJSON-1383070416-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 778.845027] nova-conductor[52435]: DEBUG nova.conductor.manager 
[None req-8fcb91b7-ee58-4909-9bb8-7cfee0000c4a tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] [instance: abe85928-f968-4b2b-9f28-fb7b8344d0a5] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 778.847350] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-8fcb91b7-ee58-4909-9bb8-7cfee0000c4a tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 778.847350] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-8fcb91b7-ee58-4909-9bb8-7cfee0000c4a tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 778.847350] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-8fcb91b7-ee58-4909-9bb8-7cfee0000c4a tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 778.849762] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-8fcb91b7-ee58-4909-9bb8-7cfee0000c4a tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 778.849762] nova-conductor[52435]: Traceback (most recent call last): [ 778.849762] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 778.849762] nova-conductor[52435]: return func(*args, **kwargs) [ 778.849762] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 778.849762] nova-conductor[52435]: selections = self._select_destinations( [ 778.849762] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 778.849762] nova-conductor[52435]: selections = self._schedule( [ 778.849762] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 778.849762] nova-conductor[52435]: self._ensure_sufficient_hosts( [ 778.849762] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 778.849762] nova-conductor[52435]: raise exception.NoValidHost(reason=reason) [ 778.849762] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 778.849762] nova-conductor[52435]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 778.851066] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-8fcb91b7-ee58-4909-9bb8-7cfee0000c4a tempest-ServersTestJSON-855635163 tempest-ServersTestJSON-855635163-project-member] [instance: abe85928-f968-4b2b-9f28-fb7b8344d0a5] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 778.878138] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-f0028133-1f87-4714-a85b-92702cda1420 tempest-ListServerFiltersTestJSON-1383070416 tempest-ListServerFiltersTestJSON-1383070416-project-member] [instance: c0697d6a-a483-4c9f-a7c5-410a1bfe58c8] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 778.878870] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f0028133-1f87-4714-a85b-92702cda1420 tempest-ListServerFiltersTestJSON-1383070416 tempest-ListServerFiltersTestJSON-1383070416-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 778.879098] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f0028133-1f87-4714-a85b-92702cda1420 tempest-ListServerFiltersTestJSON-1383070416 tempest-ListServerFiltersTestJSON-1383070416-project-member] Lock 
"00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 778.879301] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-f0028133-1f87-4714-a85b-92702cda1420 tempest-ListServerFiltersTestJSON-1383070416 tempest-ListServerFiltersTestJSON-1383070416-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 778.882700] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-f0028133-1f87-4714-a85b-92702cda1420 tempest-ListServerFiltersTestJSON-1383070416 tempest-ListServerFiltersTestJSON-1383070416-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. 
[ 778.882700] nova-conductor[52436]: Traceback (most recent call last): [ 778.882700] nova-conductor[52436]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 778.882700] nova-conductor[52436]: return func(*args, **kwargs) [ 778.882700] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 778.882700] nova-conductor[52436]: selections = self._select_destinations( [ 778.882700] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 778.882700] nova-conductor[52436]: selections = self._schedule( [ 778.882700] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 778.882700] nova-conductor[52436]: self._ensure_sufficient_hosts( [ 778.882700] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 778.882700] nova-conductor[52436]: raise exception.NoValidHost(reason=reason) [ 778.882700] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 778.882700] nova-conductor[52436]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 778.883256] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-f0028133-1f87-4714-a85b-92702cda1420 tempest-ListServerFiltersTestJSON-1383070416 tempest-ListServerFiltersTestJSON-1383070416-project-member] [instance: c0697d6a-a483-4c9f-a7c5-410a1bfe58c8] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 779.328343] nova-conductor[52435]: ERROR nova.conductor.manager [None req-507349e2-512d-4db4-878e-7b83ee990d81 tempest-ListServerFiltersTestJSON-1383070416 tempest-ListServerFiltersTestJSON-1383070416-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 779.328343] nova-conductor[52435]: Traceback (most recent call last): [ 779.328343] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 779.328343] nova-conductor[52435]: return func(*args, **kwargs) [ 779.328343] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 779.328343] nova-conductor[52435]: selections = self._select_destinations( [ 779.328343] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 779.328343] nova-conductor[52435]: selections = self._schedule( [ 779.328343] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 779.328343] nova-conductor[52435]: self._ensure_sufficient_hosts( [ 779.328343] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 779.328343] nova-conductor[52435]: raise exception.NoValidHost(reason=reason) [ 779.328343] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 779.328343] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 779.328343] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 779.328343] nova-conductor[52435]: ERROR nova.conductor.manager     host_lists = self._schedule_instances(context, request_specs[0],
[ 779.328343] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 779.328343] nova-conductor[52435]: ERROR nova.conductor.manager     host_lists = self.query_client.select_destinations(
[ 779.328343] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 779.328343] nova-conductor[52435]: ERROR nova.conductor.manager     return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 779.328343] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 779.328343] nova-conductor[52435]: ERROR nova.conductor.manager     return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 779.328343] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 779.328343] nova-conductor[52435]: ERROR nova.conductor.manager     result = self.transport._send(
[ 779.328343] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 779.328343] nova-conductor[52435]: ERROR nova.conductor.manager     return self._driver.send(target, ctxt, message,
[ 779.328343] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 779.328343] nova-conductor[52435]: ERROR nova.conductor.manager     return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 779.328343] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 779.328343] nova-conductor[52435]: ERROR nova.conductor.manager     raise result
[ 779.328343] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 779.328343] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 779.328343] nova-conductor[52435]: ERROR nova.conductor.manager
[ 779.328343] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 779.328343] nova-conductor[52435]: ERROR nova.conductor.manager     return func(*args, **kwargs)
[ 779.328343] nova-conductor[52435]: ERROR nova.conductor.manager
[ 779.328343] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 779.328343] nova-conductor[52435]: ERROR nova.conductor.manager     selections = self._select_destinations(
[ 779.328343] nova-conductor[52435]: ERROR nova.conductor.manager
[ 779.328343] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 779.328343] nova-conductor[52435]: ERROR nova.conductor.manager     selections = self._schedule(
[ 779.328343] nova-conductor[52435]: ERROR nova.conductor.manager
[ 779.328343] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 779.328343] nova-conductor[52435]: ERROR nova.conductor.manager     self._ensure_sufficient_hosts(
[ 779.328343] nova-conductor[52435]: ERROR nova.conductor.manager
[ 779.328343] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 779.328343] nova-conductor[52435]: ERROR nova.conductor.manager     raise exception.NoValidHost(reason=reason)
[ 779.328343] nova-conductor[52435]: ERROR nova.conductor.manager
[ 779.328343] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 779.328343] nova-conductor[52435]: ERROR nova.conductor.manager
[ 779.328343] nova-conductor[52435]: ERROR nova.conductor.manager
[ 779.338490] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-507349e2-512d-4db4-878e-7b83ee990d81 tempest-ListServerFiltersTestJSON-1383070416 tempest-ListServerFiltersTestJSON-1383070416-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 779.338718] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-507349e2-512d-4db4-878e-7b83ee990d81 tempest-ListServerFiltersTestJSON-1383070416 tempest-ListServerFiltersTestJSON-1383070416-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 779.338889] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-507349e2-512d-4db4-878e-7b83ee990d81 tempest-ListServerFiltersTestJSON-1383070416 tempest-ListServerFiltersTestJSON-1383070416-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 779.375810] nova-conductor[52436]: ERROR nova.conductor.manager [None req-b21b7d60-06a4-433e-88b6-ffb8ddfb5da6 tempest-ServersV294TestFqdnHostnames-786105790 tempest-ServersV294TestFqdnHostnames-786105790-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 779.375810] nova-conductor[52436]: Traceback (most recent call last):
[ 779.375810] nova-conductor[52436]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 779.375810] nova-conductor[52436]:     return func(*args, **kwargs)
[ 779.375810] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 779.375810] nova-conductor[52436]:     selections = self._select_destinations(
[ 779.375810] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 779.375810] nova-conductor[52436]:     selections = self._schedule(
[ 779.375810] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 779.375810] nova-conductor[52436]:     self._ensure_sufficient_hosts(
[ 779.375810] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 779.375810] nova-conductor[52436]:     raise exception.NoValidHost(reason=reason)
[ 779.375810] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 779.375810] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 779.375810] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 779.375810] nova-conductor[52436]: ERROR nova.conductor.manager     host_lists = self._schedule_instances(context, request_specs[0],
[ 779.375810] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 779.375810] nova-conductor[52436]: ERROR nova.conductor.manager     host_lists = self.query_client.select_destinations(
[ 779.375810] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 779.375810] nova-conductor[52436]: ERROR nova.conductor.manager     return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 779.375810] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 779.375810] nova-conductor[52436]: ERROR nova.conductor.manager     return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 779.375810] nova-conductor[52436]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 779.375810] nova-conductor[52436]: ERROR nova.conductor.manager     result = self.transport._send(
[ 779.375810] nova-conductor[52436]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 779.375810] nova-conductor[52436]: ERROR nova.conductor.manager     return self._driver.send(target, ctxt, message,
[ 779.375810] nova-conductor[52436]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 779.375810] nova-conductor[52436]: ERROR nova.conductor.manager     return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 779.375810] nova-conductor[52436]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 779.375810] nova-conductor[52436]: ERROR nova.conductor.manager     raise result
[ 779.375810] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 779.375810] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 779.375810] nova-conductor[52436]: ERROR nova.conductor.manager
[ 779.375810] nova-conductor[52436]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 779.375810] nova-conductor[52436]: ERROR nova.conductor.manager     return func(*args, **kwargs)
[ 779.375810] nova-conductor[52436]: ERROR nova.conductor.manager
[ 779.375810] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 779.375810] nova-conductor[52436]: ERROR nova.conductor.manager     selections = self._select_destinations(
[ 779.375810] nova-conductor[52436]: ERROR nova.conductor.manager
[ 779.375810] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 779.375810] nova-conductor[52436]: ERROR nova.conductor.manager     selections = self._schedule(
[ 779.375810] nova-conductor[52436]: ERROR nova.conductor.manager
[ 779.375810] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 779.375810] nova-conductor[52436]: ERROR nova.conductor.manager     self._ensure_sufficient_hosts(
[ 779.375810] nova-conductor[52436]: ERROR nova.conductor.manager
[ 779.375810] nova-conductor[52436]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 779.375810] nova-conductor[52436]: ERROR nova.conductor.manager     raise exception.NoValidHost(reason=reason)
[ 779.375810] nova-conductor[52436]: ERROR nova.conductor.manager
[ 779.375810] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 779.375810] nova-conductor[52436]: ERROR nova.conductor.manager
[ 779.375810] nova-conductor[52436]: ERROR nova.conductor.manager
[ 779.388673] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b21b7d60-06a4-433e-88b6-ffb8ddfb5da6 tempest-ServersV294TestFqdnHostnames-786105790 tempest-ServersV294TestFqdnHostnames-786105790-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 779.388840] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b21b7d60-06a4-433e-88b6-ffb8ddfb5da6 tempest-ServersV294TestFqdnHostnames-786105790 tempest-ServersV294TestFqdnHostnames-786105790-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 779.389022] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b21b7d60-06a4-433e-88b6-ffb8ddfb5da6 tempest-ServersV294TestFqdnHostnames-786105790 tempest-ServersV294TestFqdnHostnames-786105790-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 779.395378] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-507349e2-512d-4db4-878e-7b83ee990d81 tempest-ListServerFiltersTestJSON-1383070416 tempest-ListServerFiltersTestJSON-1383070416-project-member] [instance: c8f5f4e9-a4d9-4849-9051-c26c00cc5b9a] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 779.396113] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-507349e2-512d-4db4-878e-7b83ee990d81 tempest-ListServerFiltersTestJSON-1383070416 tempest-ListServerFiltersTestJSON-1383070416-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 779.396353] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-507349e2-512d-4db4-878e-7b83ee990d81 tempest-ListServerFiltersTestJSON-1383070416 tempest-ListServerFiltersTestJSON-1383070416-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 779.396519] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-507349e2-512d-4db4-878e-7b83ee990d81 tempest-ListServerFiltersTestJSON-1383070416 tempest-ListServerFiltersTestJSON-1383070416-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 779.400017] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-507349e2-512d-4db4-878e-7b83ee990d81 tempest-ListServerFiltersTestJSON-1383070416 tempest-ListServerFiltersTestJSON-1383070416-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 779.400017] nova-conductor[52435]: Traceback (most recent call last):
[ 779.400017] nova-conductor[52435]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 779.400017] nova-conductor[52435]:     return func(*args, **kwargs)
[ 779.400017] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 779.400017] nova-conductor[52435]:     selections = self._select_destinations(
[ 779.400017] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 779.400017] nova-conductor[52435]:     selections = self._schedule(
[ 779.400017] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 779.400017] nova-conductor[52435]:     self._ensure_sufficient_hosts(
[ 779.400017] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 779.400017] nova-conductor[52435]:     raise exception.NoValidHost(reason=reason)
[ 779.400017] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 779.400017] nova-conductor[52435]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 779.400585] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-507349e2-512d-4db4-878e-7b83ee990d81 tempest-ListServerFiltersTestJSON-1383070416 tempest-ListServerFiltersTestJSON-1383070416-project-member] [instance: c8f5f4e9-a4d9-4849-9051-c26c00cc5b9a] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 779.457263] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-b21b7d60-06a4-433e-88b6-ffb8ddfb5da6 tempest-ServersV294TestFqdnHostnames-786105790 tempest-ServersV294TestFqdnHostnames-786105790-project-member] [instance: 4d9c5b48-3113-406f-ab15-1ca6d7cf408d] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 779.458940] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b21b7d60-06a4-433e-88b6-ffb8ddfb5da6 tempest-ServersV294TestFqdnHostnames-786105790 tempest-ServersV294TestFqdnHostnames-786105790-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 779.459277] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b21b7d60-06a4-433e-88b6-ffb8ddfb5da6 tempest-ServersV294TestFqdnHostnames-786105790 tempest-ServersV294TestFqdnHostnames-786105790-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 779.459533] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b21b7d60-06a4-433e-88b6-ffb8ddfb5da6 tempest-ServersV294TestFqdnHostnames-786105790 tempest-ServersV294TestFqdnHostnames-786105790-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 779.466187] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-b21b7d60-06a4-433e-88b6-ffb8ddfb5da6 tempest-ServersV294TestFqdnHostnames-786105790 tempest-ServersV294TestFqdnHostnames-786105790-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 779.466187] nova-conductor[52436]: Traceback (most recent call last):
[ 779.466187] nova-conductor[52436]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 779.466187] nova-conductor[52436]:     return func(*args, **kwargs)
[ 779.466187] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 779.466187] nova-conductor[52436]:     selections = self._select_destinations(
[ 779.466187] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 779.466187] nova-conductor[52436]:     selections = self._schedule(
[ 779.466187] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 779.466187] nova-conductor[52436]:     self._ensure_sufficient_hosts(
[ 779.466187] nova-conductor[52436]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 779.466187] nova-conductor[52436]:     raise exception.NoValidHost(reason=reason)
[ 779.466187] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 779.466187] nova-conductor[52436]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 779.466187] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-b21b7d60-06a4-433e-88b6-ffb8ddfb5da6 tempest-ServersV294TestFqdnHostnames-786105790 tempest-ServersV294TestFqdnHostnames-786105790-project-member] [instance: 4d9c5b48-3113-406f-ab15-1ca6d7cf408d] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 779.740844] nova-conductor[52435]: ERROR nova.conductor.manager [None req-3dcd2b45-5821-4196-8e41-95a8a2c02aa3 tempest-ListServerFiltersTestJSON-1383070416 tempest-ListServerFiltersTestJSON-1383070416-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 779.740844] nova-conductor[52435]: Traceback (most recent call last):
[ 779.740844] nova-conductor[52435]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 779.740844] nova-conductor[52435]:     return func(*args, **kwargs)
[ 779.740844] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 779.740844] nova-conductor[52435]:     selections = self._select_destinations(
[ 779.740844] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 779.740844] nova-conductor[52435]:     selections = self._schedule(
[ 779.740844] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 779.740844] nova-conductor[52435]:     self._ensure_sufficient_hosts(
[ 779.740844] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 779.740844] nova-conductor[52435]:     raise exception.NoValidHost(reason=reason)
[ 779.740844] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 779.740844] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 779.740844] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 779.740844] nova-conductor[52435]: ERROR nova.conductor.manager     host_lists = self._schedule_instances(context, request_specs[0],
[ 779.740844] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 779.740844] nova-conductor[52435]: ERROR nova.conductor.manager     host_lists = self.query_client.select_destinations(
[ 779.740844] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 779.740844] nova-conductor[52435]: ERROR nova.conductor.manager     return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 779.740844] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 779.740844] nova-conductor[52435]: ERROR nova.conductor.manager     return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 779.740844] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 779.740844] nova-conductor[52435]: ERROR nova.conductor.manager     result = self.transport._send(
[ 779.740844] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 779.740844] nova-conductor[52435]: ERROR nova.conductor.manager     return self._driver.send(target, ctxt, message,
[ 779.740844] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 779.740844] nova-conductor[52435]: ERROR nova.conductor.manager     return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 779.740844] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 779.740844] nova-conductor[52435]: ERROR nova.conductor.manager     raise result
[ 779.740844] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 779.740844] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 779.740844] nova-conductor[52435]: ERROR nova.conductor.manager
[ 779.740844] nova-conductor[52435]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 779.740844] nova-conductor[52435]: ERROR nova.conductor.manager     return func(*args, **kwargs)
[ 779.740844] nova-conductor[52435]: ERROR nova.conductor.manager
[ 779.740844] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 779.740844] nova-conductor[52435]: ERROR nova.conductor.manager     selections = self._select_destinations(
[ 779.740844] nova-conductor[52435]: ERROR nova.conductor.manager
[ 779.740844] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 779.740844] nova-conductor[52435]: ERROR nova.conductor.manager     selections = self._schedule(
[ 779.740844] nova-conductor[52435]: ERROR nova.conductor.manager
[ 779.740844] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 779.740844] nova-conductor[52435]: ERROR nova.conductor.manager     self._ensure_sufficient_hosts(
[ 779.740844] nova-conductor[52435]: ERROR nova.conductor.manager
[ 779.740844] nova-conductor[52435]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 779.740844] nova-conductor[52435]: ERROR nova.conductor.manager     raise exception.NoValidHost(reason=reason)
[ 779.740844] nova-conductor[52435]: ERROR nova.conductor.manager
[ 779.740844] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 779.740844] nova-conductor[52435]: ERROR nova.conductor.manager
[ 779.740844] nova-conductor[52435]: ERROR nova.conductor.manager
[ 779.750010] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-3dcd2b45-5821-4196-8e41-95a8a2c02aa3 tempest-ListServerFiltersTestJSON-1383070416 tempest-ListServerFiltersTestJSON-1383070416-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 779.750278] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-3dcd2b45-5821-4196-8e41-95a8a2c02aa3 tempest-ListServerFiltersTestJSON-1383070416 tempest-ListServerFiltersTestJSON-1383070416-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 779.750449] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-3dcd2b45-5821-4196-8e41-95a8a2c02aa3 tempest-ListServerFiltersTestJSON-1383070416 tempest-ListServerFiltersTestJSON-1383070416-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 779.806165] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-3dcd2b45-5821-4196-8e41-95a8a2c02aa3 tempest-ListServerFiltersTestJSON-1383070416 tempest-ListServerFiltersTestJSON-1383070416-project-member] [instance: a1fb0eab-4052-430a-b069-23e400279698] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 779.807117] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-3dcd2b45-5821-4196-8e41-95a8a2c02aa3 tempest-ListServerFiltersTestJSON-1383070416 tempest-ListServerFiltersTestJSON-1383070416-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 779.807431] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-3dcd2b45-5821-4196-8e41-95a8a2c02aa3 tempest-ListServerFiltersTestJSON-1383070416 tempest-ListServerFiltersTestJSON-1383070416-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 779.808851] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-3dcd2b45-5821-4196-8e41-95a8a2c02aa3 tempest-ListServerFiltersTestJSON-1383070416 tempest-ListServerFiltersTestJSON-1383070416-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 779.811742] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-3dcd2b45-5821-4196-8e41-95a8a2c02aa3 tempest-ListServerFiltersTestJSON-1383070416 tempest-ListServerFiltersTestJSON-1383070416-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 779.811742] nova-conductor[52435]: Traceback (most recent call last):
[ 779.811742] nova-conductor[52435]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 779.811742] nova-conductor[52435]:     return func(*args, **kwargs)
[ 779.811742] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 779.811742] nova-conductor[52435]:     selections = self._select_destinations(
[ 779.811742] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 779.811742] nova-conductor[52435]:     selections = self._schedule(
[ 779.811742] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 779.811742] nova-conductor[52435]:     self._ensure_sufficient_hosts(
[ 779.811742] nova-conductor[52435]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 779.811742] nova-conductor[52435]:     raise exception.NoValidHost(reason=reason)
[ 779.811742] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 779.811742] nova-conductor[52435]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 779.814343] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-3dcd2b45-5821-4196-8e41-95a8a2c02aa3 tempest-ListServerFiltersTestJSON-1383070416 tempest-ListServerFiltersTestJSON-1383070416-project-member] [instance: a1fb0eab-4052-430a-b069-23e400279698] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 785.390120] nova-conductor[52436]: ERROR nova.conductor.manager [None req-fa89575e-2216-408b-82bd-01138ca2f3d6 tempest-AttachVolumeTestJSON-1073652285 tempest-AttachVolumeTestJSON-1073652285-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 785.390120] nova-conductor[52436]: Traceback (most recent call last):
[ 785.390120] nova-conductor[52436]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 785.390120] nova-conductor[52436]: return func(*args, **kwargs)
[ 785.390120] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 785.390120] nova-conductor[52436]: selections = self._select_destinations(
[ 785.390120] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 785.390120] nova-conductor[52436]: selections = self._schedule(
[ 785.390120] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 785.390120] nova-conductor[52436]: self._ensure_sufficient_hosts(
[ 785.390120] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 785.390120] nova-conductor[52436]: raise exception.NoValidHost(reason=reason)
[ 785.390120] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 785.390120] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 785.390120] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 785.390120] nova-conductor[52436]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0],
[ 785.390120] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 785.390120] nova-conductor[52436]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations(
[ 785.390120] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 785.390120] nova-conductor[52436]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 785.390120] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 785.390120] nova-conductor[52436]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 785.390120] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 785.390120] nova-conductor[52436]: ERROR nova.conductor.manager result = self.transport._send(
[ 785.390120] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 785.390120] nova-conductor[52436]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message,
[ 785.390120] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 785.390120] nova-conductor[52436]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 785.390120] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 785.390120] nova-conductor[52436]: ERROR nova.conductor.manager raise result
[ 785.390120] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 785.390120] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 785.390120] nova-conductor[52436]: ERROR nova.conductor.manager
[ 785.390120] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 785.390120] nova-conductor[52436]: ERROR nova.conductor.manager return func(*args, **kwargs)
[ 785.390120] nova-conductor[52436]: ERROR nova.conductor.manager
[ 785.390120] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 785.390120] nova-conductor[52436]: ERROR nova.conductor.manager selections = self._select_destinations(
[ 785.390120] nova-conductor[52436]: ERROR nova.conductor.manager
[ 785.390120] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 785.390120] nova-conductor[52436]: ERROR nova.conductor.manager selections = self._schedule(
[ 785.390120] nova-conductor[52436]: ERROR nova.conductor.manager
[ 785.390120] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 785.390120] nova-conductor[52436]: ERROR nova.conductor.manager self._ensure_sufficient_hosts(
[ 785.390120] nova-conductor[52436]: ERROR nova.conductor.manager
[ 785.390120] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 785.390120] nova-conductor[52436]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason)
[ 785.390120] nova-conductor[52436]: ERROR nova.conductor.manager
[ 785.390120] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 785.390120] nova-conductor[52436]: ERROR nova.conductor.manager
[ 785.390120] nova-conductor[52436]: ERROR nova.conductor.manager
[ 785.398914] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-fa89575e-2216-408b-82bd-01138ca2f3d6 tempest-AttachVolumeTestJSON-1073652285 tempest-AttachVolumeTestJSON-1073652285-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 785.399265] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-fa89575e-2216-408b-82bd-01138ca2f3d6 tempest-AttachVolumeTestJSON-1073652285 tempest-AttachVolumeTestJSON-1073652285-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 785.399504] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-fa89575e-2216-408b-82bd-01138ca2f3d6 tempest-AttachVolumeTestJSON-1073652285 tempest-AttachVolumeTestJSON-1073652285-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 785.453457] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-fa89575e-2216-408b-82bd-01138ca2f3d6 tempest-AttachVolumeTestJSON-1073652285 tempest-AttachVolumeTestJSON-1073652285-project-member] [instance: 39d41b8b-bbaa-4712-a7fd-5105051795c3] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 785.454433] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-fa89575e-2216-408b-82bd-01138ca2f3d6 tempest-AttachVolumeTestJSON-1073652285 tempest-AttachVolumeTestJSON-1073652285-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 785.454769] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-fa89575e-2216-408b-82bd-01138ca2f3d6 tempest-AttachVolumeTestJSON-1073652285 tempest-AttachVolumeTestJSON-1073652285-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 785.455069] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-fa89575e-2216-408b-82bd-01138ca2f3d6 tempest-AttachVolumeTestJSON-1073652285 tempest-AttachVolumeTestJSON-1073652285-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 785.459478] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-fa89575e-2216-408b-82bd-01138ca2f3d6 tempest-AttachVolumeTestJSON-1073652285 tempest-AttachVolumeTestJSON-1073652285-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 785.459478] nova-conductor[52436]: Traceback (most recent call last):
[ 785.459478] nova-conductor[52436]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 785.459478] nova-conductor[52436]: return func(*args, **kwargs)
[ 785.459478] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 785.459478] nova-conductor[52436]: selections = self._select_destinations(
[ 785.459478] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 785.459478] nova-conductor[52436]: selections = self._schedule(
[ 785.459478] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 785.459478] nova-conductor[52436]: self._ensure_sufficient_hosts(
[ 785.459478] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 785.459478] nova-conductor[52436]: raise exception.NoValidHost(reason=reason)
[ 785.459478] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 785.459478] nova-conductor[52436]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 785.460298] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-fa89575e-2216-408b-82bd-01138ca2f3d6 tempest-AttachVolumeTestJSON-1073652285 tempest-AttachVolumeTestJSON-1073652285-project-member] [instance: 39d41b8b-bbaa-4712-a7fd-5105051795c3] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 788.134034] nova-conductor[52435]: ERROR nova.conductor.manager [None req-5adaf985-c005-4d8b-aa86-bae8048ba6d7 tempest-AttachVolumeTestJSON-1073652285 tempest-AttachVolumeTestJSON-1073652285-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 788.134034] nova-conductor[52435]: Traceback (most recent call last):
[ 788.134034] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 788.134034] nova-conductor[52435]: return func(*args, **kwargs)
[ 788.134034] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 788.134034] nova-conductor[52435]: selections = self._select_destinations(
[ 788.134034] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 788.134034] nova-conductor[52435]: selections = self._schedule(
[ 788.134034] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 788.134034] nova-conductor[52435]: self._ensure_sufficient_hosts(
[ 788.134034] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 788.134034] nova-conductor[52435]: raise exception.NoValidHost(reason=reason)
[ 788.134034] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 788.134034] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 788.134034] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 788.134034] nova-conductor[52435]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0],
[ 788.134034] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 788.134034] nova-conductor[52435]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations(
[ 788.134034] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 788.134034] nova-conductor[52435]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 788.134034] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 788.134034] nova-conductor[52435]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 788.134034] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 788.134034] nova-conductor[52435]: ERROR nova.conductor.manager result = self.transport._send(
[ 788.134034] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 788.134034] nova-conductor[52435]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message,
[ 788.134034] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 788.134034] nova-conductor[52435]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 788.134034] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 788.134034] nova-conductor[52435]: ERROR nova.conductor.manager raise result
[ 788.134034] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 788.134034] nova-conductor[52435]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 788.134034] nova-conductor[52435]: ERROR nova.conductor.manager
[ 788.134034] nova-conductor[52435]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 788.134034] nova-conductor[52435]: ERROR nova.conductor.manager return func(*args, **kwargs)
[ 788.134034] nova-conductor[52435]: ERROR nova.conductor.manager
[ 788.134034] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 788.134034] nova-conductor[52435]: ERROR nova.conductor.manager selections = self._select_destinations(
[ 788.134034] nova-conductor[52435]: ERROR nova.conductor.manager
[ 788.134034] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 788.134034] nova-conductor[52435]: ERROR nova.conductor.manager selections = self._schedule(
[ 788.134034] nova-conductor[52435]: ERROR nova.conductor.manager
[ 788.134034] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 788.134034] nova-conductor[52435]: ERROR nova.conductor.manager self._ensure_sufficient_hosts(
[ 788.134034] nova-conductor[52435]: ERROR nova.conductor.manager
[ 788.134034] nova-conductor[52435]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 788.134034] nova-conductor[52435]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason)
[ 788.134034] nova-conductor[52435]: ERROR nova.conductor.manager
[ 788.134034] nova-conductor[52435]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 788.134034] nova-conductor[52435]: ERROR nova.conductor.manager
[ 788.134034] nova-conductor[52435]: ERROR nova.conductor.manager
[ 788.140352] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-5adaf985-c005-4d8b-aa86-bae8048ba6d7 tempest-AttachVolumeTestJSON-1073652285 tempest-AttachVolumeTestJSON-1073652285-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 788.140572] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-5adaf985-c005-4d8b-aa86-bae8048ba6d7 tempest-AttachVolumeTestJSON-1073652285 tempest-AttachVolumeTestJSON-1073652285-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 788.140745] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-5adaf985-c005-4d8b-aa86-bae8048ba6d7 tempest-AttachVolumeTestJSON-1073652285 tempest-AttachVolumeTestJSON-1073652285-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 788.176950] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-5adaf985-c005-4d8b-aa86-bae8048ba6d7 tempest-AttachVolumeTestJSON-1073652285 tempest-AttachVolumeTestJSON-1073652285-project-member] [instance: 74099d79-f2be-4790-b247-4e8028f61633] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52435) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 788.177623] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-5adaf985-c005-4d8b-aa86-bae8048ba6d7 tempest-AttachVolumeTestJSON-1073652285 tempest-AttachVolumeTestJSON-1073652285-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 788.177845] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-5adaf985-c005-4d8b-aa86-bae8048ba6d7 tempest-AttachVolumeTestJSON-1073652285 tempest-AttachVolumeTestJSON-1073652285-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 788.178016] nova-conductor[52435]: DEBUG oslo_concurrency.lockutils [None req-5adaf985-c005-4d8b-aa86-bae8048ba6d7 tempest-AttachVolumeTestJSON-1073652285 tempest-AttachVolumeTestJSON-1073652285-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52435) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 788.182421] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-5adaf985-c005-4d8b-aa86-bae8048ba6d7 tempest-AttachVolumeTestJSON-1073652285 tempest-AttachVolumeTestJSON-1073652285-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 788.182421] nova-conductor[52435]: Traceback (most recent call last):
[ 788.182421] nova-conductor[52435]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 788.182421] nova-conductor[52435]: return func(*args, **kwargs)
[ 788.182421] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 788.182421] nova-conductor[52435]: selections = self._select_destinations(
[ 788.182421] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 788.182421] nova-conductor[52435]: selections = self._schedule(
[ 788.182421] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 788.182421] nova-conductor[52435]: self._ensure_sufficient_hosts(
[ 788.182421] nova-conductor[52435]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 788.182421] nova-conductor[52435]: raise exception.NoValidHost(reason=reason)
[ 788.182421] nova-conductor[52435]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 788.182421] nova-conductor[52435]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 788.183083] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-5adaf985-c005-4d8b-aa86-bae8048ba6d7 tempest-AttachVolumeTestJSON-1073652285 tempest-AttachVolumeTestJSON-1073652285-project-member] [instance: 74099d79-f2be-4790-b247-4e8028f61633] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 792.717063] nova-conductor[52436]: ERROR nova.conductor.manager [None req-9e578bd6-479a-4d09-bda0-ac16ea7b2eb9 tempest-ServerActionsV293TestJSON-1433800374 tempest-ServerActionsV293TestJSON-1433800374-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 792.717063] nova-conductor[52436]: Traceback (most recent call last):
[ 792.717063] nova-conductor[52436]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 792.717063] nova-conductor[52436]: return func(*args, **kwargs)
[ 792.717063] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 792.717063] nova-conductor[52436]: selections = self._select_destinations(
[ 792.717063] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 792.717063] nova-conductor[52436]: selections = self._schedule(
[ 792.717063] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 792.717063] nova-conductor[52436]: self._ensure_sufficient_hosts(
[ 792.717063] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 792.717063] nova-conductor[52436]: raise exception.NoValidHost(reason=reason)
[ 792.717063] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 792.717063] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 792.717063] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 792.717063] nova-conductor[52436]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0],
[ 792.717063] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 792.717063] nova-conductor[52436]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations(
[ 792.717063] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 792.717063] nova-conductor[52436]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 792.717063] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 792.717063] nova-conductor[52436]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 792.717063] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 792.717063] nova-conductor[52436]: ERROR nova.conductor.manager result = self.transport._send(
[ 792.717063] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 792.717063] nova-conductor[52436]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message,
[ 792.717063] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 792.717063] nova-conductor[52436]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 792.717063] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 792.717063] nova-conductor[52436]: ERROR nova.conductor.manager raise result
[ 792.717063] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 792.717063] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 792.717063] nova-conductor[52436]: ERROR nova.conductor.manager
[ 792.717063] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 792.717063] nova-conductor[52436]: ERROR nova.conductor.manager return func(*args, **kwargs)
[ 792.717063] nova-conductor[52436]: ERROR nova.conductor.manager
[ 792.717063] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 792.717063] nova-conductor[52436]: ERROR nova.conductor.manager selections = self._select_destinations(
[ 792.717063] nova-conductor[52436]: ERROR nova.conductor.manager
[ 792.717063] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 792.717063] nova-conductor[52436]: ERROR nova.conductor.manager selections = self._schedule(
[ 792.717063] nova-conductor[52436]: ERROR nova.conductor.manager
[ 792.717063] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 792.717063] nova-conductor[52436]: ERROR nova.conductor.manager self._ensure_sufficient_hosts(
[ 792.717063] nova-conductor[52436]: ERROR nova.conductor.manager
[ 792.717063] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 792.717063] nova-conductor[52436]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason)
[ 792.717063] nova-conductor[52436]: ERROR nova.conductor.manager
[ 792.717063] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 792.717063] nova-conductor[52436]: ERROR nova.conductor.manager
[ 792.717063] nova-conductor[52436]: ERROR nova.conductor.manager
[ 792.725851] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-9e578bd6-479a-4d09-bda0-ac16ea7b2eb9 tempest-ServerActionsV293TestJSON-1433800374 tempest-ServerActionsV293TestJSON-1433800374-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 792.726126] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-9e578bd6-479a-4d09-bda0-ac16ea7b2eb9 tempest-ServerActionsV293TestJSON-1433800374 tempest-ServerActionsV293TestJSON-1433800374-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 792.726327] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-9e578bd6-479a-4d09-bda0-ac16ea7b2eb9 tempest-ServerActionsV293TestJSON-1433800374 tempest-ServerActionsV293TestJSON-1433800374-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 792.766824] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-9e578bd6-479a-4d09-bda0-ac16ea7b2eb9 tempest-ServerActionsV293TestJSON-1433800374 tempest-ServerActionsV293TestJSON-1433800374-project-member] [instance: e9c006ea-e917-4f88-b332-324e32c00839] block_device_mapping [BlockDeviceMapping(attachment_id=1e8e1d7b-acd0-43e1-a205-f7b4300b069c,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='volume',device_name=None,device_type=None,disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id=None,instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='volume',tag=None,updated_at=,uuid=,volume_id='33ef452a-0d19-4572-9a02-ef89cb3a63f7',volume_size=1,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 792.767536] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-9e578bd6-479a-4d09-bda0-ac16ea7b2eb9 tempest-ServerActionsV293TestJSON-1433800374 tempest-ServerActionsV293TestJSON-1433800374-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 792.767740] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-9e578bd6-479a-4d09-bda0-ac16ea7b2eb9 tempest-ServerActionsV293TestJSON-1433800374 tempest-ServerActionsV293TestJSON-1433800374-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 792.767905] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-9e578bd6-479a-4d09-bda0-ac16ea7b2eb9 tempest-ServerActionsV293TestJSON-1433800374 tempest-ServerActionsV293TestJSON-1433800374-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 792.770595] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-9e578bd6-479a-4d09-bda0-ac16ea7b2eb9 tempest-ServerActionsV293TestJSON-1433800374 tempest-ServerActionsV293TestJSON-1433800374-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 792.770595] nova-conductor[52436]: Traceback (most recent call last):
[ 792.770595] nova-conductor[52436]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 792.770595] nova-conductor[52436]: return func(*args, **kwargs)
[ 792.770595] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 792.770595] nova-conductor[52436]: selections = self._select_destinations(
[ 792.770595] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 792.770595] nova-conductor[52436]: selections = self._schedule(
[ 792.770595] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 792.770595] nova-conductor[52436]: self._ensure_sufficient_hosts(
[ 792.770595] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 792.770595] nova-conductor[52436]: raise exception.NoValidHost(reason=reason)
[ 792.770595] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 792.770595] nova-conductor[52436]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 792.771088] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-9e578bd6-479a-4d09-bda0-ac16ea7b2eb9 tempest-ServerActionsV293TestJSON-1433800374 tempest-ServerActionsV293TestJSON-1433800374-project-member] [instance: e9c006ea-e917-4f88-b332-324e32c00839] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 816.305575] nova-conductor[52436]: ERROR nova.scheduler.utils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance 11f468ba-a807-4490-9dd5-58eaad007865 was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"]
[ 816.305575] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Rescheduling: True {{(pid=52436) build_instances /opt/stack/nova/nova/conductor/manager.py:695}}
[ 816.305997] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 11f468ba-a807-4490-9dd5-58eaad007865.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 11f468ba-a807-4490-9dd5-58eaad007865.
[ 816.305997] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-5ceb329e-e806-49d1-8534-99e1585182e9 tempest-ServersAdmin275Test-194307288 tempest-ServersAdmin275Test-194307288-project-member] [instance: 11f468ba-a807-4490-9dd5-58eaad007865] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 11f468ba-a807-4490-9dd5-58eaad007865.
[ 820.291877] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Took 0.12 seconds to select destinations for 1 instance(s). {{(pid=52436) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}}
[ 820.303148] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 820.303375] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 820.303564] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 820.333911] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 820.334134] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 820.334304] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 820.334646] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 820.334825] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 820.334988] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 820.342964] nova-conductor[52436]: DEBUG nova.quota [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Getting quotas for project 599a0d1dc613482c952b7bc2c9024674. Resources: {'instances', 'ram', 'cores'} {{(pid=52436) _get_quotas /opt/stack/nova/nova/quota.py:393}}
[ 820.345305] nova-conductor[52436]: DEBUG nova.quota [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Getting quotas for user 9186471c7ae84396a91e6e3f929aded5 and project 599a0d1dc613482c952b7bc2c9024674. Resources: {'instances', 'ram', 'cores'} {{(pid=52436) _get_quotas /opt/stack/nova/nova/quota.py:383}}
[ 820.350934] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52436) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}}
[ 820.351375] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 820.351570] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 820.351732] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 820.354547] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 820.355194] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 820.355393] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 820.355623] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 820.369180] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 820.369389] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 820.369557] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 823.317465] nova-conductor[52436]: ERROR nova.scheduler.utils [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 629678d3-be29-4351-920d-9cd5c5600081, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance e0c34b27-f25b-48b8-8ace-0bbe5380a336 was re-scheduled: Binding failed for port 629678d3-be29-4351-920d-9cd5c5600081, please check neutron logs for more information.\n']
[ 823.318147] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Rescheduling: True {{(pid=52436) build_instances /opt/stack/nova/nova/conductor/manager.py:695}}
[ 823.318453] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance e0c34b27-f25b-48b8-8ace-0bbe5380a336.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance e0c34b27-f25b-48b8-8ace-0bbe5380a336.
[ 823.318823] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance e0c34b27-f25b-48b8-8ace-0bbe5380a336.
[ 823.337147] nova-conductor[52436]: DEBUG nova.network.neutron [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] deallocate_for_instance() {{(pid=52436) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 823.353198] nova-conductor[52436]: DEBUG nova.network.neutron [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Instance cache missing network info. {{(pid=52436) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 823.356171] nova-conductor[52436]: DEBUG nova.network.neutron [None req-27f3a826-1482-4da4-b09f-2c822483cff2 tempest-ImagesOneServerNegativeTestJSON-1831759335 tempest-ImagesOneServerNegativeTestJSON-1831759335-project-member] [instance: e0c34b27-f25b-48b8-8ace-0bbe5380a336] Updating instance_info_cache with network_info: [] {{(pid=52436) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 827.840151] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Took 0.12 seconds to select destinations for 1 instance(s). {{(pid=52436) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}}
[ 827.852359] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 827.852584] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 827.852754] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 827.878457] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 827.878671] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 827.878836] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 827.879202] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 827.879382] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 827.879538] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 827.887920] nova-conductor[52436]: DEBUG nova.quota [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Getting quotas for project 2b0a5316a9ae491c985d1f21a2ae77e1. Resources: {'instances', 'ram', 'cores'} {{(pid=52436) _get_quotas /opt/stack/nova/nova/quota.py:393}}
[ 827.890215] nova-conductor[52436]: DEBUG nova.quota [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Getting quotas for user 233f69ec97334ba9894ae70d7941be35 and project 2b0a5316a9ae491c985d1f21a2ae77e1. Resources: {'instances', 'ram', 'cores'} {{(pid=52436) _get_quotas /opt/stack/nova/nova/quota.py:383}}
[ 827.896013] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] [instance: 4e0763e1-7348-472a-8838-648991382724] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52436) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}}
[ 827.896408] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 827.896599] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 827.896759] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 827.899557] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] [instance: 4e0763e1-7348-472a-8838-648991382724] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 827.900178] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 827.900382] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 827.900545] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 827.914547] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 827.914766] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 827.914934] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 830.120602] nova-conductor[52436]: ERROR nova.scheduler.utils [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] [instance: 4e0763e1-7348-472a-8838-648991382724] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', '
File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File 
"/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port e71f4986-317b-4816-a3bd-ef4e509893a2, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 4e0763e1-7348-472a-8838-648991382724 was re-scheduled: Binding failed for port e71f4986-317b-4816-a3bd-ef4e509893a2, please check neutron logs for more information.\n'] [ 830.121185] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Rescheduling: True {{(pid=52436) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 830.121409] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-825429ae-cfdc-4552-b852-fd153618f151 
tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 4e0763e1-7348-472a-8838-648991382724.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 4e0763e1-7348-472a-8838-648991382724. [ 830.121904] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] [instance: 4e0763e1-7348-472a-8838-648991382724] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 4e0763e1-7348-472a-8838-648991382724. [ 830.139937] nova-conductor[52436]: DEBUG nova.network.neutron [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] [instance: 4e0763e1-7348-472a-8838-648991382724] deallocate_for_instance() {{(pid=52436) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 830.156330] nova-conductor[52436]: DEBUG nova.network.neutron [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] [instance: 4e0763e1-7348-472a-8838-648991382724] Instance cache missing network info. 
{{(pid=52436) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 830.159406] nova-conductor[52436]: DEBUG nova.network.neutron [None req-825429ae-cfdc-4552-b852-fd153618f151 tempest-InstanceActionsTestJSON-1277579837 tempest-InstanceActionsTestJSON-1277579837-project-member] [instance: 4e0763e1-7348-472a-8838-648991382724] Updating instance_info_cache with network_info: [] {{(pid=52436) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 864.253312] nova-conductor[52435]: ERROR nova.scheduler.utils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in 
wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance 26aa196e-e745-494d-814f-7da3cf18ec14 was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"] [ 864.253852] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Rescheduling: True {{(pid=52435) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 864.254095] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 26aa196e-e745-494d-814f-7da3cf18ec14.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. 
Exhausted all hosts available for retrying build failures for instance 26aa196e-e745-494d-814f-7da3cf18ec14. [ 864.254316] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-3fa7d77c-06a7-4676-968d-f39c83e8a7bb tempest-ServerShowV257Test-1815500896 tempest-ServerShowV257Test-1815500896-project-member] [instance: 26aa196e-e745-494d-814f-7da3cf18ec14] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 26aa196e-e745-494d-814f-7da3cf18ec14. [ 868.628446] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Took 0.12 seconds to select destinations for 1 instance(s). {{(pid=52436) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 868.643958] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 868.644200] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 868.644367] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 
tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 868.675972] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 868.676210] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 868.676375] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 868.676713] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Acquiring lock 
"b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 868.676895] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 868.677070] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 868.686513] nova-conductor[52436]: DEBUG nova.quota [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Getting quotas for project 1aa856aa79e94290acfdb44c20d4a028. Resources: {'instances', 'ram', 'cores'} {{(pid=52436) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 868.689594] nova-conductor[52436]: DEBUG nova.quota [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Getting quotas for user 6ba985c594674d8ab57f66762d73fe52 and project 1aa856aa79e94290acfdb44c20d4a028. 
Resources: {'instances', 'ram', 'cores'} {{(pid=52436) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 868.695483] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52436) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 868.695944] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 868.696163] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 868.696328] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 868.699881] nova-conductor[52436]: DEBUG 
nova.conductor.manager [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 868.700558] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 868.700758] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 868.700918] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Lock 
"b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 868.713161] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 868.713361] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 868.713525] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 871.033063] nova-conductor[52435]: ERROR nova.scheduler.utils [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call 
last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File 
"/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 36f52190-23be-485b-91d7-1066f2ed40cf, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 4d789e06-d563-46a0-80fc-0040ce074bff was re-scheduled: Binding failed for port 36f52190-23be-485b-91d7-1066f2ed40cf, please check neutron logs for more information.\n'] [ 871.033544] nova-conductor[52435]: DEBUG nova.conductor.manager [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Rescheduling: True {{(pid=52435) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 871.033773] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 
tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 4d789e06-d563-46a0-80fc-0040ce074bff.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 4d789e06-d563-46a0-80fc-0040ce074bff. [ 871.033980] nova-conductor[52435]: WARNING nova.scheduler.utils [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 4d789e06-d563-46a0-80fc-0040ce074bff. [ 871.054777] nova-conductor[52435]: DEBUG nova.network.neutron [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] deallocate_for_instance() {{(pid=52435) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 871.072974] nova-conductor[52435]: DEBUG nova.network.neutron [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Instance cache missing network info. 
{{(pid=52435) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 871.075825] nova-conductor[52435]: DEBUG nova.network.neutron [None req-5bbfd84e-652d-4116-947c-a9bb40e07d62 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 4d789e06-d563-46a0-80fc-0040ce074bff] Updating instance_info_cache with network_info: [] {{(pid=52435) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 873.178306] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Took 0.11 seconds to select destinations for 1 instance(s). {{(pid=52436) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}}
[ 873.189271] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 873.189480] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 873.189643] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 873.212478] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 873.212688] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 873.212858] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 873.213210] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 873.213392] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 873.213549] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 873.221330] nova-conductor[52436]: DEBUG nova.quota [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Getting quotas for project 1aa856aa79e94290acfdb44c20d4a028. Resources: {'instances', 'ram', 'cores'} {{(pid=52436) _get_quotas /opt/stack/nova/nova/quota.py:393}}
[ 873.223501] nova-conductor[52436]: DEBUG nova.quota [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Getting quotas for user 6ba985c594674d8ab57f66762d73fe52 and project 1aa856aa79e94290acfdb44c20d4a028. Resources: {'instances', 'ram', 'cores'} {{(pid=52436) _get_quotas /opt/stack/nova/nova/quota.py:383}}
[ 873.228726] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52436) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}}
[ 873.229235] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 873.229425] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 873.229586] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 873.232125] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 873.232709] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 873.232900] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 873.233077] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 873.244281] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Acquiring lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 873.244479] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 873.244645] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Lock "b3c1ceaa-bc48-48ee-8b60-930585e76a41" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 875.541582] nova-conductor[52436]: ERROR nova.scheduler.utils [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 63e41031-ef05-42ad-984e-0c452e5e8238, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165 was re-scheduled: Binding failed for port 63e41031-ef05-42ad-984e-0c452e5e8238, please check neutron logs for more information.\n']
[ 875.542275] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Rescheduling: True {{(pid=52436) build_instances /opt/stack/nova/nova/conductor/manager.py:695}}
[ 875.542509] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165.
[ 875.542813] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165.
[ 875.560493] nova-conductor[52436]: DEBUG nova.network.neutron [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] deallocate_for_instance() {{(pid=52436) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 875.578968] nova-conductor[52436]: DEBUG nova.network.neutron [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Instance cache missing network info. {{(pid=52436) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 875.581980] nova-conductor[52436]: DEBUG nova.network.neutron [None req-3552d8a9-9963-4869-9fb4-73a73b5816c4 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 305bd6dd-1e6c-4f5b-b6c2-1f5dd6a24165] Updating instance_info_cache with network_info: [] {{(pid=52436) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 877.966245] nova-conductor[52436]: ERROR nova.conductor.manager [None req-b9f8a191-842e-41be-8097-d8f0ec79c619 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 877.966245] nova-conductor[52436]: Traceback (most recent call last):
[ 877.966245] nova-conductor[52436]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 877.966245] nova-conductor[52436]: return func(*args, **kwargs)
[ 877.966245] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 877.966245] nova-conductor[52436]: selections = self._select_destinations(
[ 877.966245] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 877.966245] nova-conductor[52436]: selections = self._schedule(
[ 877.966245] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 877.966245] nova-conductor[52436]: self._ensure_sufficient_hosts(
[ 877.966245] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 877.966245] nova-conductor[52436]: raise exception.NoValidHost(reason=reason)
[ 877.966245] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 877.966245] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 877.966245] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 877.966245] nova-conductor[52436]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0],
[ 877.966245] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 877.966245] nova-conductor[52436]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations(
[ 877.966245] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 877.966245] nova-conductor[52436]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 877.966245] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 877.966245] nova-conductor[52436]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 877.966245] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 877.966245] nova-conductor[52436]: ERROR nova.conductor.manager result = self.transport._send(
[ 877.966245] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 877.966245] nova-conductor[52436]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message,
[ 877.966245] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 877.966245] nova-conductor[52436]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 877.966245] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 877.966245] nova-conductor[52436]: ERROR nova.conductor.manager raise result
[ 877.966245] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 877.966245] nova-conductor[52436]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 877.966245] nova-conductor[52436]: ERROR nova.conductor.manager
[ 877.966245] nova-conductor[52436]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 877.966245] nova-conductor[52436]: ERROR nova.conductor.manager return func(*args, **kwargs)
[ 877.966245] nova-conductor[52436]: ERROR nova.conductor.manager
[ 877.966245] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 877.966245] nova-conductor[52436]: ERROR nova.conductor.manager selections = self._select_destinations(
[ 877.966245] nova-conductor[52436]: ERROR nova.conductor.manager
[ 877.966245] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 877.966245] nova-conductor[52436]: ERROR nova.conductor.manager selections = self._schedule(
[ 877.966245] nova-conductor[52436]: ERROR nova.conductor.manager
[ 877.966245] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 877.966245] nova-conductor[52436]: ERROR nova.conductor.manager self._ensure_sufficient_hosts(
[ 877.966245] nova-conductor[52436]: ERROR nova.conductor.manager
[ 877.966245] nova-conductor[52436]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 877.966245] nova-conductor[52436]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason)
[ 877.966245] nova-conductor[52436]: ERROR nova.conductor.manager
[ 877.966245] nova-conductor[52436]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 877.966245] nova-conductor[52436]: ERROR nova.conductor.manager
[ 877.966245] nova-conductor[52436]: ERROR nova.conductor.manager
[ 877.972804] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b9f8a191-842e-41be-8097-d8f0ec79c619 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 877.973047] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b9f8a191-842e-41be-8097-d8f0ec79c619 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 877.973224] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b9f8a191-842e-41be-8097-d8f0ec79c619 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 878.008355] nova-conductor[52436]: DEBUG nova.conductor.manager [None req-b9f8a191-842e-41be-8097-d8f0ec79c619 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 59354370-8fc9-4933-ba88-f2886434eb5c] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='4a4a4830-1ff7-4cff-ab75-d665942f46b5',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52436) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 878.009024] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b9f8a191-842e-41be-8097-d8f0ec79c619 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 878.009236] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b9f8a191-842e-41be-8097-d8f0ec79c619 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 878.009402] nova-conductor[52436]: DEBUG oslo_concurrency.lockutils [None req-b9f8a191-842e-41be-8097-d8f0ec79c619 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52436) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 878.013759] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-b9f8a191-842e-41be-8097-d8f0ec79c619 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 878.013759] nova-conductor[52436]: Traceback (most recent call last):
[ 878.013759] nova-conductor[52436]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 878.013759] nova-conductor[52436]: return func(*args, **kwargs)
[ 878.013759] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 878.013759] nova-conductor[52436]: selections = self._select_destinations(
[ 878.013759] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 878.013759] nova-conductor[52436]: selections = self._schedule(
[ 878.013759] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 878.013759] nova-conductor[52436]: self._ensure_sufficient_hosts(
[ 878.013759] nova-conductor[52436]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 878.013759] nova-conductor[52436]: raise exception.NoValidHost(reason=reason)
[ 878.013759] nova-conductor[52436]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 878.013759] nova-conductor[52436]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 878.014357] nova-conductor[52436]: WARNING nova.scheduler.utils [None req-b9f8a191-842e-41be-8097-d8f0ec79c619 tempest-AttachVolumeNegativeTest-1762778477 tempest-AttachVolumeNegativeTest-1762778477-project-member] [instance: 59354370-8fc9-4933-ba88-f2886434eb5c] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.