[ 467.695732] nova-conductor[51912]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 468.910572] nova-conductor[51912]: INFO dbcounter [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] Registered counter for database nova_api
[ 468.919112] nova-conductor[51912]: DEBUG oslo_db.sqlalchemy.engines [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=51912) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 468.922148] nova-conductor[51912]: DEBUG dbcounter [-] [51912] Writer thread running {{(pid=51912) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:99}}
[ 468.944704] nova-conductor[51912]: DEBUG nova.context [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),59419fe5-2c5b-4004-8c8e-a51d243fef53(cell1) {{(pid=51912) load_cells /opt/stack/nova/nova/context.py:464}}
[ 468.946404] nova-conductor[51912]: DEBUG oslo_concurrency.lockutils [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=51912) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 468.946609] nova-conductor[51912]: DEBUG oslo_concurrency.lockutils [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=51912) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 468.947057] nova-conductor[51912]: DEBUG oslo_concurrency.lockutils [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=51912) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 468.947391] nova-conductor[51912]: DEBUG oslo_concurrency.lockutils [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=51912) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 468.947571] nova-conductor[51912]: DEBUG oslo_concurrency.lockutils [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=51912) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 468.948492] nova-conductor[51912]: DEBUG oslo_concurrency.lockutils [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=51912) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 468.949838] nova-conductor[51912]: INFO dbcounter [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] Registered counter for database nova_cell0
[ 468.952150] nova-conductor[51912]: INFO dbcounter [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] Registered counter for database nova_cell1
[ 468.954551] nova-conductor[51912]: DEBUG oslo_db.sqlalchemy.engines [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=51912) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 468.954885] nova-conductor[51912]: DEBUG oslo_db.sqlalchemy.engines [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=51912) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 468.957729] nova-conductor[51912]: DEBUG dbcounter [-] [51912] Writer thread running {{(pid=51912) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:99}}
[ 468.958366] nova-conductor[51912]: DEBUG dbcounter [-] [51912] Writer thread running {{(pid=51912) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:99}}
[ 469.013625] nova-conductor[51912]: DEBUG oslo_concurrency.lockutils [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] Acquiring lock "singleton_lock" {{(pid=51912) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 469.013795] nova-conductor[51912]: DEBUG oslo_concurrency.lockutils [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] Acquired lock "singleton_lock" {{(pid=51912) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 469.014031] nova-conductor[51912]: DEBUG oslo_concurrency.lockutils [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] Releasing lock "singleton_lock" {{(pid=51912) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 469.014449] nova-conductor[51912]: INFO oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] Starting 2 workers
[ 469.018679] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] Started child 52331 {{(pid=51912) _start_child /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:575}}
[ 469.024139] nova-conductor[52331]: INFO nova.service [-] Starting conductor node (version 0.1.0)
[ 469.024400] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] Started child 52332 {{(pid=51912) _start_child /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:575}}
[ 469.024400] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] Full set of CONF: {{(pid=51912) wait /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:649}}
[ 469.024400] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] ******************************************************************************** {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2589}}
[ 469.024400] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] Configuration options gathered from: {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2590}}
[ 469.024400] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] command line args: ['--config-file', '/etc/nova/nova.conf'] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2591}}
[ 469.024400] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] config files: ['/etc/nova/nova.conf'] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2592}}
[ 469.024600] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] ================================================================================ {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2594}}
[ 469.024600] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] allow_resize_to_same_host = True {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.024854] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] arq_binding_timeout = 300 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.024936] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] block_device_allocate_retries = 60 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.025119] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] block_device_allocate_retries_interval = 3 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.025323] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cert = self.pem {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.025493] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] compute_driver = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.025738] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] compute_monitors = [] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.026126] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] config_dir = [] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.026194] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] config_drive_format = iso9660 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.026323] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] config_file = ['/etc/nova/nova.conf'] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.026517] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] config_source = [] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.026705] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] console_host = devstack {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.026898] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] control_exchange = nova {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.027084] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cpu_allocation_ratio = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.027255] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] daemon = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.027445] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] debug = True {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.027618] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] default_access_ip_network_name = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.027673] nova-conductor[52332]: INFO nova.service [-] Starting conductor node (version 0.1.0)
[ 469.027853] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] default_availability_zone = nova {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.027977] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] default_ephemeral_format = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.028268] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.028455] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] default_schedule_zone = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.028505] nova-conductor[52331]: INFO dbcounter [None req-144190a3-f362-4f9f-861c-5940b729406f None None] Registered counter for database nova_cell1
[ 469.028620] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] disk_allocation_ratio = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.028791] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] enable_new_services = True {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.029009] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] enabled_apis = ['osapi_compute'] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.029184] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] enabled_ssl_apis = [] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.029448] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] flat_injected = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.029523] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] force_config_drive = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.029692] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] force_raw_images = True {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.029857] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] graceful_shutdown_timeout = 5 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.030036] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] heal_instance_info_cache_interval = 60 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.030502] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] host = devstack {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.030707] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] initial_cpu_allocation_ratio = 4.0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.030891] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] initial_disk_allocation_ratio = 1.0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.031051] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] initial_ram_allocation_ratio = 1.0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.031295] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.031462] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] instance_build_timeout = 0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.031639] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] instance_delete_interval = 300 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.031809] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] instance_format = [instance: %(uuid)s] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.031968] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] instance_name_template = instance-%08x {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.032155] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] instance_usage_audit = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.032353] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] instance_usage_audit_period = month {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.032536] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.032719] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] instances_path = /opt/stack/data/nova/instances {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.032896] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] internal_service_availability_zone = internal {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.033254] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] key = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.033254] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] live_migration_retry_count = 30 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.033388] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] log_config_append = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.033582] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.033743] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] log_dir = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.033928] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] log_file = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.034055] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] log_options = True {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.034231] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] log_rotate_interval = 1 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.034425] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] log_rotate_interval_type = days {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.034615] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] log_rotation_type = none {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.034747] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.034814] nova-conductor[52331]: DEBUG oslo_db.sqlalchemy.engines [None req-144190a3-f362-4f9f-861c-5940b729406f None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=52331) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 469.034881] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.035040] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.035212] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.035335] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.035529] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] long_rpc_timeout = 1800 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.035701] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] max_concurrent_builds = 10 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.035853] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] max_concurrent_live_migrations = 1 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.035943] nova-conductor[52332]: INFO dbcounter [None req-5b059885-c693-439d-82f8-8e44a5dcbe93 None None] Registered counter for database nova_cell1
[ 469.036019] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] max_concurrent_snapshots = 5 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.036290] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] max_local_block_devices = 3 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.036359] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] max_logfile_count = 30 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.036514] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] max_logfile_size_mb = 200 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.036670] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] maximum_instance_delete_attempts = 5 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.036865] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] metadata_listen = 0.0.0.0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.037096] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] metadata_listen_port = 8775 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.037285] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] metadata_workers = 2 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.037447] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] migrate_max_retries = -1 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.037609] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] mkisofs_cmd = genisoimage {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.037807] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] my_block_storage_ip = 10.180.1.21 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.037933] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] my_ip = 10.180.1.21 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.038089] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] network_allocate_retries = 0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.038278] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.038456] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] osapi_compute_listen = 0.0.0.0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.038615] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] osapi_compute_listen_port = 8774 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.038791] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] osapi_compute_unique_server_name_scope = {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.038966] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] osapi_compute_workers = 2 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.039131] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] password_length = 12 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.039298] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] periodic_enable = True {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.039451] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] periodic_fuzzy_delay = 60 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.039493] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writer thread running {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:99}}
[ 469.039632] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] pointer_model = usbtablet {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.039814] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] preallocate_images = none {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.039981] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] publish_errors = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.040118] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] pybasedir = /opt/stack/nova {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.040286] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] ram_allocation_ratio = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.040442] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] rate_limit_burst = 0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.040596] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] rate_limit_except_level = CRITICAL {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.041009] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] rate_limit_interval = 0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.041009] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] reboot_timeout = 0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.041085] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] reclaim_instance_interval = 0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.041226] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] record = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.041375] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] reimage_timeout_per_gb = 20 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.041527] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] report_interval = 10 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.041674] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] rescue_timeout = 0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.041822] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] reserved_host_cpus = 0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.041970] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] reserved_host_disk_mb = 0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.042136] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] reserved_host_memory_mb = 512 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.042334] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] reserved_huge_pages = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.042468] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] resize_confirm_window = 0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.042610] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] resize_fs_using_block_device = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.042757] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] resume_guests_state_on_host_boot = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.042929] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.043111] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] rpc_response_timeout = 60 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.043288] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] run_external_periodic_tasks = True {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.043463] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] running_deleted_instance_action = reap {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.043619] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] running_deleted_instance_poll_interval = 1800 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.043772] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] running_deleted_instance_timeout = 0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.043938] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] scheduler_instance_sync_interval = 120 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.044111] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] service_down_time = 60 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 469.044290] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None
None] servicegroup_driver = db {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 469.044441] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] shelved_offload_time = 0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 469.044504] nova-conductor[52332]: DEBUG oslo_db.sqlalchemy.engines [None req-5b059885-c693-439d-82f8-8e44a5dcbe93 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=52332) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 469.044590] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] shelved_poll_interval = 3600 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 469.044752] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] shutdown_timeout = 0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 469.044928] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] source_is_ipv6 = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 469.045079] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] ssl_only = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 469.045234] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] state_path = /opt/stack/data/nova {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 
469.045392] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] sync_power_state_interval = 600 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 469.045547] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] sync_power_state_pool_size = 1000 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 469.045726] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] syslog_log_facility = LOG_USER {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 469.045905] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] tempdir = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 469.046090] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] timeout_nbd = 10 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 469.046257] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] transport_url = **** {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 469.046406] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] update_resources_interval = 0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 469.046565] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] use_cow_images = True {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 469.046746] nova-conductor[51912]: DEBUG oslo_service.service 
[None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] use_eventlog = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 469.046898] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] use_journal = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 469.047098] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] use_json = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 469.047291] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] use_rootwrap_daemon = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 469.047479] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] use_stderr = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 469.047663] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] use_syslog = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 469.047812] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vcpu_pin_set = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 469.047981] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vif_plugging_is_fatal = True {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 469.048207] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vif_plugging_timeout = 300 
{{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 469.048436] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] virt_mkfs = [] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 469.048619] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] volume_usage_poll_interval = 0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 469.048774] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] watch_log_file = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 469.048965] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] web = /usr/share/spice-html5 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 469.049957] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_concurrency.disable_process_locking = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.050157] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_concurrency.lock_path = /opt/stack/data/nova {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.050354] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.050516] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] 
oslo_messaging_metrics.metrics_enabled = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.050683] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_messaging_metrics.metrics_process_name = {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.050846] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.051002] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.051221] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api.auth_strategy = keystone {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.051381] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api.compute_link_prefix = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.051563] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.051732] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api.dhcp_domain = novalocal {{(pid=51912) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.051890] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api.enable_instance_password = True {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.052067] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api.glance_link_prefix = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.052245] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.052372] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writer thread running {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:99}} [ 469.052418] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api.instance_list_cells_batch_strategy = distributed {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.052578] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api.instance_list_per_project_cells = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.052734] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api.list_records_by_skipping_down_cells = True {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.052895] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api.local_metadata_per_cell = False {{(pid=51912) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.053057] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api.max_limit = 1000 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.053233] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api.metadata_cache_expiration = 15 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.053397] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api.neutron_default_tenant_id = default {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.053836] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api.use_forwarded_for = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.053836] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api.use_neutron_default_nets = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.053912] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.054028] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api.vendordata_dynamic_failure_fatal = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.054185] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api.vendordata_dynamic_read_timeout = 5 
{{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.054369] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api.vendordata_dynamic_ssl_certfile = {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.054532] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api.vendordata_dynamic_targets = [] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.054709] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api.vendordata_jsonfile_path = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.054900] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api.vendordata_providers = ['StaticJSON'] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.055147] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cache.backend = dogpile.cache.memcached {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.055309] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cache.backend_argument = **** {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.055484] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cache.config_prefix = cache.oslo {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.055677] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] 
cache.dead_timeout = 60.0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.056244] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cache.debug_cache_backend = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.056419] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cache.enable_retry_client = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.056572] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cache.enable_socket_keepalive = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.056734] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cache.enabled = True {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.056924] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cache.expiration_time = 600 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.057083] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cache.hashclient_retry_attempts = 2 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.057240] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cache.hashclient_retry_delay = 1.0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.057397] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] 
cache.memcache_dead_retry = 300 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.057556] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cache.memcache_password = {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.057711] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.057865] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.058016] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cache.memcache_pool_maxsize = 10 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.058184] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cache.memcache_pool_unused_timeout = 60 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.058337] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cache.memcache_sasl_enabled = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.058518] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cache.memcache_servers = ['localhost:11211'] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.058688] nova-conductor[51912]: DEBUG oslo_service.service [None 
req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cache.memcache_socket_timeout = 1.0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.058862] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cache.memcache_username = {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.059024] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cache.proxies = [] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.059177] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cache.retry_attempts = 2 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.059335] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cache.retry_delay = 0.0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.059497] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cache.socket_keepalive_count = 1 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.059662] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cache.socket_keepalive_idle = 1 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.059815] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cache.socket_keepalive_interval = 1 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.059961] nova-conductor[51912]: DEBUG oslo_service.service [None 
req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cache.tls_allowed_ciphers = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.060123] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cache.tls_cafile = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.060204] nova-conductor[52331]: DEBUG nova.service [None req-144190a3-f362-4f9f-861c-5940b729406f None None] Creating RPC server for service conductor {{(pid=52331) start /opt/stack/nova/nova/service.py:182}} [ 469.060269] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cache.tls_certfile = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.060420] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cache.tls_enabled = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.060566] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cache.tls_keyfile = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.060788] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cinder.auth_section = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.060974] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cinder.auth_type = password {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.061146] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cinder.cafile = None 
{{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.061330] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cinder.catalog_info = volumev3::publicURL {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.061483] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cinder.certfile = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.061636] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cinder.collect_timing = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.061811] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cinder.cross_az_attach = True {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.061976] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cinder.debug = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.062130] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cinder.endpoint_template = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.062282] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cinder.http_retries = 3 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.062439] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cinder.insecure = False {{(pid=51912) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.062588] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cinder.keyfile = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.062763] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cinder.os_region_name = RegionOne {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.062920] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cinder.split_loggers = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.063074] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cinder.timeout = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.063255] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.063407] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] compute.cpu_dedicated_set = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.063558] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] compute.cpu_shared_set = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.063713] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] compute.image_type_exclude_list = [] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.063871] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] compute.live_migration_wait_for_vif_plug = True {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.064041] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] compute.max_concurrent_disk_ops = 0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.064196] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] compute.max_disk_devices_to_attach = -1 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.064368] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.064527] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.064685] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] compute.resource_provider_association_refresh = 300 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.064848] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] compute.shutdown_retry_interval = 10 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.065022] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.065190] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] conductor.workers = 2 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.065361] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] console.allowed_origins = [] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.065513] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] console.ssl_ciphers = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.065673] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] console.ssl_minimum_version = default {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.065843] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] consoleauth.token_ttl = 600 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.066034] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cyborg.cafile = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.066187] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cyborg.certfile = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.066341] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cyborg.collect_timing = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.066490] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cyborg.connect_retries = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.066639] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cyborg.connect_retry_delay = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.066788] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cyborg.endpoint_override = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.066900] nova-conductor[52332]: DEBUG nova.service [None req-5b059885-c693-439d-82f8-8e44a5dcbe93 None None] Creating RPC server for service conductor {{(pid=52332) start /opt/stack/nova/nova/service.py:182}}
[ 469.066947] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cyborg.insecure = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.067088] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cyborg.keyfile = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.067236] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cyborg.max_version = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.067385] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cyborg.min_version = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.067533] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cyborg.region_name = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.067683] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cyborg.service_name = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.067852] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cyborg.service_type = accelerator {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.068023] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cyborg.split_loggers = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.068174] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cyborg.status_code_retries = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.068323] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cyborg.status_code_retry_delay = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.068484] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cyborg.timeout = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.068672] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.068831] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] cyborg.version = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.069016] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] database.backend = sqlalchemy {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.069413] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] database.connection = **** {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.069413] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] database.connection_debug = 0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.069532] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] database.connection_parameters = {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.069673] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] database.connection_recycle_time = 3600 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.069830] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] database.connection_trace = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.069983] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] database.db_inc_retry_interval = True {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.070141] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] database.db_max_retries = 20 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.070296] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] database.db_max_retry_interval = 10 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.070452] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] database.db_retry_interval = 1 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.070613] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] database.max_overflow = 50 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.070764] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] database.max_pool_size = 5 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.070932] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] database.max_retries = 10 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.071084] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] database.mysql_enable_ndb = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.071244] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] database.mysql_sql_mode = TRADITIONAL {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.071392] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] database.mysql_wsrep_sync_wait = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.071544] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] database.pool_timeout = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.071706] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] database.retry_interval = 10 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.071856] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] database.slave_connection = **** {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.072025] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] database.sqlite_synchronous = True {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.072187] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] database.use_db_reconnect = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.072357] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api_database.backend = sqlalchemy {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.072528] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api_database.connection = **** {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.072687] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api_database.connection_debug = 0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.072848] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api_database.connection_parameters = {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.073003] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api_database.connection_recycle_time = 3600 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.073158] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api_database.connection_trace = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.073308] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api_database.db_inc_retry_interval = True {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.073463] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api_database.db_max_retries = 20 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.073657] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api_database.db_max_retry_interval = 10 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.073831] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api_database.db_retry_interval = 1 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.074019] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api_database.max_overflow = 50 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.074183] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api_database.max_pool_size = 5 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.074341] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api_database.max_retries = 10 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.074820] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api_database.mysql_enable_ndb = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.074820] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.074820] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api_database.mysql_wsrep_sync_wait = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.074979] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api_database.pool_timeout = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.075122] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api_database.retry_interval = 10 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.075271] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api_database.slave_connection = **** {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.075426] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] api_database.sqlite_synchronous = True {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.075620] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] devices.enabled_mdev_types = [] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.075789] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.076044] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] ephemeral_storage_encryption.enabled = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.076157] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] ephemeral_storage_encryption.key_size = 512 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.076368] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] glance.api_servers = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.076678] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] glance.cafile = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.076678] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] glance.certfile = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.076837] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] glance.collect_timing = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.077114] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] glance.connect_retries = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.077187] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] glance.connect_retry_delay = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.077331] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] glance.debug = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.077522] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] glance.default_trusted_certificate_ids = [] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.077679] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] glance.enable_certificate_validation = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.077845] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] glance.enable_rbd_download = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.078017] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] glance.endpoint_override = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.078206] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] glance.insecure = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.078300] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] glance.keyfile = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.078439] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] glance.max_version = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.078642] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] glance.min_version = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.078786] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] glance.num_retries = 3 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.078950] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] glance.rbd_ceph_conf = {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.079104] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] glance.rbd_connect_timeout = 5 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.079267] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] glance.rbd_pool = {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.079480] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] glance.rbd_user = {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.079648] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] glance.region_name = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.079803] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] glance.service_name = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.080180] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] glance.service_type = image {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.080180] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] glance.split_loggers = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.080284] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] glance.status_code_retries = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.080432] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] glance.status_code_retry_delay = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.080585] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] glance.timeout = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.080762] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.080922] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] glance.verify_glance_signatures = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.081150] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] glance.version = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.081277] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] guestfs.debug = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.081458] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] hyperv.config_drive_cdrom = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.081619] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] hyperv.config_drive_inject_password = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.081809] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.081979] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] hyperv.enable_instance_metrics_collection = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.082158] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] hyperv.enable_remotefx = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.082336] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] hyperv.instances_path_share = {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.082501] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] hyperv.iscsi_initiator_list = [] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.082664] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] hyperv.limit_cpu_features = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.082826] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.082984] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.083143] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] hyperv.power_state_check_timeframe = 60 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.083303] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] hyperv.power_state_event_polling_interval = 2 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.083500] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.083670] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] hyperv.use_multipath_io = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.083842] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] hyperv.volume_attach_retry_count = 10 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.084070] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] hyperv.volume_attach_retry_interval = 5 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.084175] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] hyperv.vswitch_name = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.084336] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.084533] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] mks.enabled = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.085125] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.085226] nova-conductor[52331]: DEBUG nova.service [None req-144190a3-f362-4f9f-861c-5940b729406f None None] Join ServiceGroup membership for this service conductor {{(pid=52331) start /opt/stack/nova/nova/service.py:199}}
[ 469.085341] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] image_cache.manager_interval = 2400 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.085482] nova-conductor[52331]: DEBUG nova.servicegroup.drivers.db [None req-144190a3-f362-4f9f-861c-5940b729406f None None] DB_Driver: join new ServiceGroup member devstack to the conductor group, service = {{(pid=52331) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}}
[ 469.085522] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] image_cache.precache_concurrency = 1 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.085667] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] image_cache.remove_unused_base_images = True {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.085862] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.086032] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.086207] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] image_cache.subdirectory_name = _base {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.086428] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] ironic.api_max_retries = 60 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.086590] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] ironic.api_retry_interval = 2 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.086794] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] ironic.auth_section = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.086987] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] ironic.auth_type = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.087148] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] ironic.cafile = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.087315] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] ironic.certfile = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.087507] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] ironic.collect_timing = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.087665] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] ironic.connect_retries = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.087820] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] ironic.connect_retry_delay = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.087974] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] ironic.endpoint_override = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.088147] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] ironic.insecure = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.088300] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] ironic.keyfile = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.088456] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] ironic.max_version = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.088607] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] ironic.min_version = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.088785] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] ironic.partition_key = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.088946] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] ironic.peer_list = [] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.089136] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] ironic.region_name = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.089325] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] ironic.serial_console_state_timeout = 10 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.089484] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] ironic.service_name = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.089696] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] ironic.service_type = baremetal {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.089861] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] ironic.split_loggers = False {{(pid=51912) log_opt_values
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.090039] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] ironic.status_code_retries = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.090220] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] ironic.status_code_retry_delay = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.090379] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] ironic.timeout = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.090583] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.090741] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] ironic.version = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.090979] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.091196] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] key_manager.fixed_key = **** {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.091419] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] barbican.auth_endpoint = 
http://localhost/identity/v3 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.091595] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] barbican.barbican_api_version = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.091765] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] barbican.barbican_endpoint = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.091953] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] barbican.barbican_endpoint_type = public {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.092143] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] barbican.barbican_region_name = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.092299] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] barbican.cafile = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.092454] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] barbican.certfile = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.092644] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] barbican.collect_timing = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.092800] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] 
barbican.insecure = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.092953] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] barbican.keyfile = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.093115] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] barbican.number_of_retries = 60 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.093295] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] barbican.retry_delay = 1 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.093482] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] barbican.send_service_user_token = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.093642] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] barbican.split_loggers = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.093779] nova-conductor[52332]: DEBUG nova.service [None req-5b059885-c693-439d-82f8-8e44a5dcbe93 None None] Join ServiceGroup membership for this service conductor {{(pid=52332) start /opt/stack/nova/nova/service.py:199}} [ 469.093821] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] barbican.timeout = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.093945] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] barbican.verify_ssl = True {{(pid=51912) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.094098] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] barbican.verify_ssl_path = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.094136] nova-conductor[52332]: DEBUG nova.servicegroup.drivers.db [None req-5b059885-c693-439d-82f8-8e44a5dcbe93 None None] DB_Driver: join new ServiceGroup member devstack to the conductor group, service = {{(pid=52332) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 469.094285] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] barbican_service_user.auth_section = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.094487] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] barbican_service_user.auth_type = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.094647] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] barbican_service_user.cafile = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.094829] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] barbican_service_user.certfile = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.095009] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] barbican_service_user.collect_timing = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.095169] nova-conductor[51912]: DEBUG oslo_service.service [None 
req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] barbican_service_user.insecure = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.095322] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] barbican_service_user.keyfile = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.095481] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] barbican_service_user.split_loggers = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.095634] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] barbican_service_user.timeout = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.095823] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vault.approle_role_id = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.095983] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vault.approle_secret_id = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.096153] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vault.cafile = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.096305] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vault.certfile = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.096461] nova-conductor[51912]: DEBUG oslo_service.service [None 
req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vault.collect_timing = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.096614] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vault.insecure = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.096764] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vault.keyfile = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.096951] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vault.kv_mountpoint = secret {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.097116] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vault.kv_version = 2 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.097273] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vault.namespace = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.097429] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vault.root_token_id = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.097585] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vault.split_loggers = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.097736] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] 
vault.ssl_ca_crt_file = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.097891] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vault.timeout = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.098070] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vault.use_ssl = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.098249] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.098434] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] keystone.cafile = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.098594] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] keystone.certfile = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.098754] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] keystone.collect_timing = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.098908] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] keystone.connect_retries = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.099064] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] keystone.connect_retry_delay = None 
{{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.099216] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] keystone.endpoint_override = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.099390] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] keystone.insecure = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.099545] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] keystone.keyfile = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.099697] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] keystone.max_version = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.099847] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] keystone.min_version = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.100070] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] keystone.region_name = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.100175] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] keystone.service_name = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.100338] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] keystone.service_type = identity {{(pid=51912) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.100496] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] keystone.split_loggers = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.100648] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] keystone.status_code_retries = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.100802] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] keystone.status_code_retry_delay = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.100959] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] keystone.timeout = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.101134] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.101290] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] keystone.version = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.101534] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.connection_uri = {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.101718] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.cpu_mode = None {{(pid=51912) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.101882] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.cpu_model_extra_flags = [] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.102047] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.cpu_models = [] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.102215] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.cpu_power_governor_high = performance {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.102378] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.cpu_power_governor_low = powersave {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.102539] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.cpu_power_management = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.102723] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.102907] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.device_detach_attempts = 8 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.103069] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] 
libvirt.device_detach_timeout = 20 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.103232] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.disk_cachemodes = [] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.103386] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.disk_prefix = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.103544] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.enabled_perf_events = [] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.103702] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.file_backed_memory = 0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.103876] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.gid_maps = [] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.104059] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.hw_disk_discard = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.104217] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.hw_machine_type = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.104385] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] 
libvirt.images_rbd_ceph_conf = {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.104542] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.104700] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.104881] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.images_rbd_glance_store_name = {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.105042] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.images_rbd_pool = rbd {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.105206] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.images_type = default {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.105363] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.images_volume_group = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.105519] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.inject_key = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.105675] nova-conductor[51912]: DEBUG oslo_service.service [None 
req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.inject_partition = -2 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.105852] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.inject_password = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.106028] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.iscsi_iface = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.106187] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.iser_use_multipath = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.106347] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.live_migration_bandwidth = 0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.106505] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.live_migration_completion_timeout = 800 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.106662] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.live_migration_downtime = 500 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.106821] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.live_migration_downtime_delay = 75 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.106978] nova-conductor[51912]: DEBUG 
oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.live_migration_downtime_steps = 10 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.107137] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.live_migration_inbound_addr = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.107290] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.live_migration_permit_auto_converge = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.107443] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.live_migration_permit_post_copy = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.107594] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.live_migration_scheme = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.107759] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.live_migration_timeout_action = abort {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.107917] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.live_migration_tunnelled = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.108084] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.live_migration_uri = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.108260] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.live_migration_with_native_tls = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.108437] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.max_queues = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.108599] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.mem_stats_period_seconds = 10 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.108753] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.nfs_mount_options = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.109025] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.nfs_mount_point_base = /opt/stack/data/nova/mnt {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.109415] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.num_aoe_discover_tries = 3 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.109610] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.num_iser_scan_tries = 5 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.109771] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.num_memory_encrypted_guests = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.109936] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.num_nvme_discover_tries = 5 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.110098] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.num_pcie_ports = 0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.110262] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.num_volume_scan_tries = 5 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.110477] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.pmem_namespaces = [] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.110641] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.quobyte_client_cfg = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.110868] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.quobyte_mount_point_base = /opt/stack/data/nova/mnt {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.111039] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.rbd_connect_timeout = 5 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.111199] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.111358] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.111513] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.rbd_secret_uuid = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.111666] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.rbd_user = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.111824] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.realtime_scheduler_priority = 1 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.111994] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.remote_filesystem_transport = ssh {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.112171] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.rescue_image_id = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.112326] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.rescue_kernel_id = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.112478] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.rescue_ramdisk_id = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.112642] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.rng_dev_path = /dev/urandom {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.112795] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.rx_queue_size = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.112957] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.smbfs_mount_options = {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.113177] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.smbfs_mount_point_base = /opt/stack/data/nova/mnt {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.113340] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.snapshot_compression = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.113495] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.snapshot_image_format = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.113704] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.113871] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.sparse_logical_volumes = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.114027] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.swtpm_enabled = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.114210] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.swtpm_group = tss {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.114373] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.swtpm_user = tss {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.114536] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.sysinfo_serial = unique {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.114691] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.tx_queue_size = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.114868] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.uid_maps = [] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.115025] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.use_virtio_for_bridges = True {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.115193] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.virt_type = kvm {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.115357] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.volume_clear = zero {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.115514] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.volume_clear_size = 0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.115673] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.volume_use_multipath = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.115824] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.vzstorage_cache_path = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.115984] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.116166] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.vzstorage_mount_group = qemu {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.116331] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.vzstorage_mount_opts = [] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.116490] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.116700] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/nova/mnt {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.116871] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.vzstorage_mount_user = stack {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.117034] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.117222] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] neutron.auth_section = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.117391] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] neutron.auth_type = password {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.117549] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] neutron.cafile = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.117704] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] neutron.certfile = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.117863] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] neutron.collect_timing = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.118035] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] neutron.connect_retries = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.118189] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] neutron.connect_retry_delay = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.118355] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] neutron.default_floating_pool = public {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.118509] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] neutron.endpoint_override = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.118670] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] neutron.extension_sync_interval = 600 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.118829] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] neutron.http_retries = 3 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.118990] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] neutron.insecure = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.119143] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] neutron.keyfile = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.119295] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] neutron.max_version = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.119460] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] neutron.metadata_proxy_shared_secret = **** {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.119614] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] neutron.min_version = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.119775] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] neutron.ovs_bridge = br-int {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.119966] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] neutron.physnets = [] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.120151] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] neutron.region_name = RegionOne {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.120317] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] neutron.service_metadata_proxy = True {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.120470] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] neutron.service_name = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.120635] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] neutron.service_type = network {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.120795] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] neutron.split_loggers = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.120949] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] neutron.status_code_retries = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.121104] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] neutron.status_code_retry_delay = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.121256] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] neutron.timeout = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.121430] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.121586] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] neutron.version = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.121752] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] notifications.bdms_in_notifications = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.121925] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] notifications.default_level = INFO {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.122097] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] notifications.notification_format = unversioned {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.122255] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] notifications.notify_on_state_change = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.122423] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.122594] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] pci.alias = [] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.122755] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] pci.device_spec = [] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.122913] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] pci.report_in_placement = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.123117] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] placement.auth_section = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.123290] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] placement.auth_type = password {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.123455] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] placement.auth_url = http://10.180.1.21/identity {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.123608] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] placement.cafile = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.123760] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] placement.certfile = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.123930] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] placement.collect_timing = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.124092] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] placement.connect_retries = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.124245] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] placement.connect_retry_delay = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.124395] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] placement.default_domain_id = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.124548] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] placement.default_domain_name = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.124698] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] placement.domain_id = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.124849] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] placement.domain_name = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.125002] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] placement.endpoint_override = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.125158] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] placement.insecure = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.125308] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] placement.keyfile = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.125457] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] placement.max_version = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.125606] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] placement.min_version = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.125766] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] placement.password = **** {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.125918] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] placement.project_domain_id = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.126080] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] placement.project_domain_name = Default {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.126240] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] placement.project_id = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.126406] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] placement.project_name = service {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.126569] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] placement.region_name = RegionOne {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.126723] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] placement.service_name = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.126889] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] placement.service_type = placement {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.127050] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] placement.split_loggers = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.127203] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] placement.status_code_retries = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.127357] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] placement.status_code_retry_delay = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.127509] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] placement.system_scope = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.127661] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] placement.timeout = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.127830] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] placement.trust_id = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.127965] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] placement.user_domain_id = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.128144] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] placement.user_domain_name = Default {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.128298] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] placement.user_id = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.128469] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] placement.username = placement {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.128640] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.128795] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] placement.version = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.128989] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] quota.cores = 20 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.129150] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] quota.count_usage_from_placement = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.129317] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.129498] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] quota.injected_file_content_bytes = 10240 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.129659] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] quota.injected_file_path_length = 255 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.129824] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] quota.injected_files = 5 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.129985] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] quota.instances = 10 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.130147] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] quota.key_pairs = 100 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.130307] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] quota.metadata_items = 128 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.130468] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] quota.ram = 51200 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.130626] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] quota.recheck_quota = True {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.130789] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] quota.server_group_members = 10 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.130952] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] quota.server_groups = 10 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.131118] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] rdp.enabled = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.131428] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.131710] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.131809] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.131994] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] scheduler.image_metadata_prefilter = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.132202] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.132378] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] scheduler.max_attempts = 3 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.132555] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] scheduler.max_placement_results = 1000 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.132730] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.132890] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] scheduler.query_placement_for_availability_zone = True {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.133048] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] scheduler.query_placement_for_image_type_support = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.133232] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.133402] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] scheduler.workers = 2 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.133593] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.133760] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] filter_scheduler.aggregate_image_properties_isolation_separator = . {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.133965] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.134132] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.134307] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.134476] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.134628] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=51912) log_opt_values
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.134839] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.135003] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] filter_scheduler.host_subset_size = 1 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.135159] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] filter_scheduler.image_properties_default_architecture = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.135319] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.135487] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] filter_scheduler.isolated_hosts = [] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.135665] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] filter_scheduler.isolated_images = [] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.135826] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] filter_scheduler.max_instances_per_host = 50 {{(pid=51912) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.135983] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.136157] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] filter_scheduler.pci_in_placement = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.136316] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.136496] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.136661] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.136819] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.136978] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.137138] nova-conductor[51912]: DEBUG 
oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.137315] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] filter_scheduler.track_instance_changes = True {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.137781] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.137781] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] metrics.required = True {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.137910] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] metrics.weight_multiplier = 1.0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.137972] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] metrics.weight_of_unavailable = -10000.0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.138131] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] metrics.weight_setting = [] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.138441] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=51912) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.138606] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] serial_console.enabled = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.138794] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] serial_console.port_range = 10000:20000 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.138963] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.139131] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.139321] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] serial_console.serialproxy_port = 6083 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.139491] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] service_user.auth_section = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.139659] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] service_user.auth_type = password {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.139814] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] service_user.cafile = None 
{{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.139967] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] service_user.certfile = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.140140] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] service_user.collect_timing = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.140303] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] service_user.insecure = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.140455] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] service_user.keyfile = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.140620] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] service_user.send_service_user_token = True {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.140778] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] service_user.split_loggers = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.140931] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] service_user.timeout = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.141100] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] spice.agent_enabled = True 
{{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.141285] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] spice.enabled = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.141598] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.141823] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] spice.html5proxy_host = 0.0.0.0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.141996] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] spice.html5proxy_port = 6082 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.142155] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] spice.image_compression = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.142315] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] spice.jpeg_compression = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.142490] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] spice.playback_compression = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.142658] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] spice.server_listen 
= 127.0.0.1 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.142821] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.142978] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] spice.streaming_mode = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.143134] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] spice.zlib_compression = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.143296] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] upgrade_levels.baseapi = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.143450] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] upgrade_levels.cert = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.143612] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] upgrade_levels.compute = auto {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.143765] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] upgrade_levels.conductor = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.143928] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] upgrade_levels.scheduler = 
None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.144106] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vendordata_dynamic_auth.auth_section = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.144336] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vendordata_dynamic_auth.auth_type = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.144427] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vendordata_dynamic_auth.cafile = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.144579] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vendordata_dynamic_auth.certfile = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.144742] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vendordata_dynamic_auth.collect_timing = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.144897] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vendordata_dynamic_auth.insecure = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.145051] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vendordata_dynamic_auth.keyfile = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.145209] nova-conductor[51912]: DEBUG oslo_service.service [None 
req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vendordata_dynamic_auth.split_loggers = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.145359] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vendordata_dynamic_auth.timeout = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.145566] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vmware.api_retry_count = 10 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.145729] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vmware.ca_file = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.145896] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vmware.cache_prefix = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.146050] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vmware.cluster_name = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.146222] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vmware.connection_pool_size = 10 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.146376] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vmware.console_delay_seconds = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.146527] nova-conductor[51912]: DEBUG oslo_service.service [None 
req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vmware.datastore_regex = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.146683] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vmware.host_ip = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.146842] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vmware.host_password = **** {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.147000] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vmware.host_port = 443 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.147159] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vmware.host_username = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.147316] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vmware.insecure = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.147470] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vmware.integration_bridge = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.147635] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vmware.maximum_objects = 100 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.147786] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None 
None] vmware.pbm_default_policy = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.147965] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vmware.pbm_enabled = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.148142] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vmware.pbm_wsdl_location = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.148310] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vmware.serial_log_dir = /opt/vmware/vspc {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.148467] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vmware.serial_port_proxy_uri = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.148620] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vmware.serial_port_service_uri = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.148784] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vmware.task_poll_interval = 0.5 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.148946] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vmware.use_linked_clone = True {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.149111] nova-conductor[51912]: DEBUG oslo_service.service [None 
req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vmware.vnc_keymap = en-us {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.149274] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vmware.vnc_port = 5900 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.149433] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vmware.vnc_port_total = 10000 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.149632] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vnc.auth_schemes = ['none'] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.149799] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vnc.enabled = True {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.150102] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.150287] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.150475] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vnc.novncproxy_port = 6080 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.150656] nova-conductor[51912]: DEBUG oslo_service.service [None 
req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vnc.server_listen = 127.0.0.1 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.150821] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.150978] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vnc.vencrypt_ca_certs = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.151135] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vnc.vencrypt_client_cert = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.151286] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] vnc.vencrypt_client_key = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.151501] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.151660] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] workarounds.disable_fallback_pcpu_query = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 469.151819] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] workarounds.disable_group_policy_check_upcall = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 
469.151973] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.152574] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] workarounds.disable_rootwrap = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.152574] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] workarounds.enable_numa_live_migration = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.152574] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.152574] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.152751] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] workarounds.handle_virt_lifecycle_events = True {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.152882] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] workarounds.libvirt_disable_apic = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.153035] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] workarounds.never_download_image_if_on_rbd = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.153193] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.153349] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.153502] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.153656] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.153810] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.153967] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.154123] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.154279] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.154435] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.154640] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.154810] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] wsgi.client_socket_timeout = 900 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.154975] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] wsgi.default_pool_size = 1000 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.155137] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] wsgi.keep_alive = True {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.155303] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] wsgi.max_header_line = 16384 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.155462] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] wsgi.secure_proxy_ssl_header = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.155615] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] wsgi.ssl_ca_file = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.155770] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] wsgi.ssl_cert_file = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.155926] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] wsgi.ssl_key_file = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.156157] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] wsgi.tcp_keepidle = 600 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.156285] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.156474] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] zvm.ca_file = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.156632] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] zvm.cloud_connector_url = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.156850] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] zvm.image_tmp_path = /opt/stack/data/nova/images {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.157015] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] zvm.reachable_timeout = 300 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.157243] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_policy.enforce_new_defaults = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.157416] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_policy.enforce_scope = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.157602] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_policy.policy_default_rule = default {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.158362] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.158362] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_policy.policy_file = policy.yaml {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.158362] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.158362] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.158530] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.158651] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.158833] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.159026] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.159216] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.159423] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] profiler.connection_string = messaging:// {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.159596] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] profiler.enabled = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.159775] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] profiler.es_doc_type = notification {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.159954] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] profiler.es_scroll_size = 10000 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.160133] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] profiler.es_scroll_time = 2m {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.160294] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] profiler.filter_error_trace = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.160457] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] profiler.hmac_keys = SECRET_KEY {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.160618] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] profiler.sentinel_service_name = mymaster {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.160799] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] profiler.socket_timeout = 0.1 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.160960] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] profiler.trace_sqlalchemy = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.161153] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] remote_debug.host = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.161330] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] remote_debug.port = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.161514] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.161677] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.161844] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.162002] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.162165] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.162338] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.162503] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.162661] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.162820] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.162975] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.163144] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.163304] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.163472] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.163633] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.163791] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.163966] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.164137] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.164295] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.164456] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.164612] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.164773] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.164931] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.165090] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.165248] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.165408] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.165573] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_messaging_rabbit.ssl = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.165736] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.165900] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.166061] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.166244] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.166410] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_messaging_rabbit.ssl_version = {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.166618] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.166787] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_messaging_notifications.retry = -1 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.166962] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.167137] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_messaging_notifications.transport_url = **** {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.167332] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_limit.auth_section = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.167489] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_limit.auth_type = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.167642] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_limit.cafile = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.167794] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_limit.certfile = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.167950] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_limit.collect_timing = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.168116] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_limit.connect_retries = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.168270] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_limit.connect_retry_delay = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.168421] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_limit.endpoint_id = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.168576] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_limit.endpoint_override = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.168733] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_limit.insecure = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.168886] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_limit.keyfile = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.169035] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_limit.max_version = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.169186] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_limit.min_version = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.169338] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_limit.region_name = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.169509] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_limit.service_name = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.169658] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_limit.service_type = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.169815] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_limit.split_loggers = False {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.169969] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_limit.status_code_retries = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.170119] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_limit.status_code_retry_delay = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.170270] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_limit.timeout = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.170439] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_limit.valid_interfaces = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.170590] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_limit.version = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.170765] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_reports.file_event_handler = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.170946] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_reports.file_event_handler_interval = 1 {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.171101] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] oslo_reports.log_dir = None {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 469.171224] nova-conductor[51912]: DEBUG oslo_service.service [None req-a1dad1a8-526a-488d-89ab-163bd1d2ca9a None None] ******************************************************************************** {{(pid=51912) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2613}}
[ 478.943508] nova-conductor[51912]: DEBUG dbcounter [-] [51912] Writing DB stats nova_api:SELECT=3 {{(pid=51912) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 479.009514] nova-conductor[51912]: DEBUG dbcounter [-] [51912] Writing DB stats nova_cell0:SELECT=3 {{(pid=51912) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 479.016692] nova-conductor[51912]: DEBUG dbcounter [-] [51912] Writing DB stats nova_cell1:SELECT=3 {{(pid=51912) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 494.099694] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:SELECT=9 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 504.082094] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell1:SELECT=15,nova_cell1:INSERT=1,nova_cell1:UPDATE=3 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 504.107107] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:SELECT=2 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 514.114218] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:SELECT=2 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 524.087558] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell1:SELECT=3,nova_cell1:UPDATE=2 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 524.118843] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:SELECT=2 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 534.121679] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:SELECT=2 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 544.088714] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell1:SELECT=4,nova_cell1:UPDATE=2 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 544.124975] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:SELECT=2 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 554.094604] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell1:SELECT=2,nova_cell1:UPDATE=1 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 554.130505] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:SELECT=2 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 555.466765] nova-conductor[52332]: DEBUG oslo_db.sqlalchemy.engines [None req-486672b5-609a-4b62-b02a-983cbe760ff1 None None] Parent process 51912 forked (52332) with an open database connection, which is being discarded and recreated. {{(pid=52332) checkout /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:434}}
[ 555.469074] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writer thread running {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:99}}
[ 555.470148] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_api:SELECT=4 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 565.470567] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_api:SELECT=1 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 574.100588] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell1:SELECT=19,nova_cell1:UPDATE=3,nova_cell1:INSERT=1 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 574.140432] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:SELECT=19,nova_cell1:INSERT=1,nova_cell1:UPDATE=1 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 584.152262] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:SELECT=2 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 593.829784] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-758edda8-234a-4ba0-8aa8-145cece231f9 tempest-ServerDiagnosticsTest-1089501081 tempest-ServerDiagnosticsTest-1089501081-project-member] Took 0.87 seconds to select destinations for 1 instance(s). {{(pid=52332) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}}
[ 593.853394] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-758edda8-234a-4ba0-8aa8-145cece231f9 tempest-ServerDiagnosticsTest-1089501081 tempest-ServerDiagnosticsTest-1089501081-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 593.853704] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-758edda8-234a-4ba0-8aa8-145cece231f9 tempest-ServerDiagnosticsTest-1089501081 tempest-ServerDiagnosticsTest-1089501081-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 593.855310] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-758edda8-234a-4ba0-8aa8-145cece231f9 tempest-ServerDiagnosticsTest-1089501081 tempest-ServerDiagnosticsTest-1089501081-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 593.857446] nova-conductor[52332]: INFO dbcounter [None req-758edda8-234a-4ba0-8aa8-145cece231f9 tempest-ServerDiagnosticsTest-1089501081 tempest-ServerDiagnosticsTest-1089501081-project-member] Registered counter for database nova_cell1
[ 593.860168] nova-conductor[52332]: DEBUG oslo_db.sqlalchemy.engines [None req-758edda8-234a-4ba0-8aa8-145cece231f9 tempest-ServerDiagnosticsTest-1089501081 tempest-ServerDiagnosticsTest-1089501081-project-member] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=52332) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 593.862435] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writer thread running {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:99}}
[ 593.925726] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-758edda8-234a-4ba0-8aa8-145cece231f9 tempest-ServerDiagnosticsTest-1089501081 tempest-ServerDiagnosticsTest-1089501081-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 593.925951] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-758edda8-234a-4ba0-8aa8-145cece231f9 tempest-ServerDiagnosticsTest-1089501081 tempest-ServerDiagnosticsTest-1089501081-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 593.926682] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-758edda8-234a-4ba0-8aa8-145cece231f9 tempest-ServerDiagnosticsTest-1089501081 tempest-ServerDiagnosticsTest-1089501081-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 593.927083] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-758edda8-234a-4ba0-8aa8-145cece231f9 tempest-ServerDiagnosticsTest-1089501081 tempest-ServerDiagnosticsTest-1089501081-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 593.927275] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-758edda8-234a-4ba0-8aa8-145cece231f9 tempest-ServerDiagnosticsTest-1089501081 tempest-ServerDiagnosticsTest-1089501081-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 593.927436] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-758edda8-234a-4ba0-8aa8-145cece231f9 tempest-ServerDiagnosticsTest-1089501081 tempest-ServerDiagnosticsTest-1089501081-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 593.929085] nova-conductor[52332]: INFO dbcounter [None req-758edda8-234a-4ba0-8aa8-145cece231f9 tempest-ServerDiagnosticsTest-1089501081 tempest-ServerDiagnosticsTest-1089501081-project-member] Registered counter for database nova_cell0
[ 593.936690] nova-conductor[52332]: DEBUG oslo_db.sqlalchemy.engines [None req-758edda8-234a-4ba0-8aa8-145cece231f9 tempest-ServerDiagnosticsTest-1089501081 tempest-ServerDiagnosticsTest-1089501081-project-member] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=52332) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 593.939732] nova-conductor[52332]: DEBUG
dbcounter [-] [52332] Writer thread running {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:99}} [ 593.951328] nova-conductor[52332]: DEBUG nova.quota [None req-758edda8-234a-4ba0-8aa8-145cece231f9 tempest-ServerDiagnosticsTest-1089501081 tempest-ServerDiagnosticsTest-1089501081-project-member] Getting quotas for project 7eba5dc09e4746f58feb68bb03e32bd8. Resources: {'instances', 'ram', 'cores'} {{(pid=52332) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 593.954727] nova-conductor[52332]: DEBUG nova.quota [None req-758edda8-234a-4ba0-8aa8-145cece231f9 tempest-ServerDiagnosticsTest-1089501081 tempest-ServerDiagnosticsTest-1089501081-project-member] Getting quotas for user 10e7be25460642628fb69aed0ba2ac2e and project 7eba5dc09e4746f58feb68bb03e32bd8. Resources: {'instances', 'ram', 'cores'} {{(pid=52332) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 593.961420] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-758edda8-234a-4ba0-8aa8-145cece231f9 tempest-ServerDiagnosticsTest-1089501081 tempest-ServerDiagnosticsTest-1089501081-project-member] [instance: 72a3ff61-01d0-444c-a32d-d9908e0c9c74] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52332) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 593.962089] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-758edda8-234a-4ba0-8aa8-145cece231f9 tempest-ServerDiagnosticsTest-1089501081 tempest-ServerDiagnosticsTest-1089501081-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 593.962226] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-758edda8-234a-4ba0-8aa8-145cece231f9 tempest-ServerDiagnosticsTest-1089501081 
tempest-ServerDiagnosticsTest-1089501081-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 593.962319] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-758edda8-234a-4ba0-8aa8-145cece231f9 tempest-ServerDiagnosticsTest-1089501081 tempest-ServerDiagnosticsTest-1089501081-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 593.969218] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-758edda8-234a-4ba0-8aa8-145cece231f9 tempest-ServerDiagnosticsTest-1089501081 tempest-ServerDiagnosticsTest-1089501081-project-member] [instance: 72a3ff61-01d0-444c-a32d-d9908e0c9c74] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 593.969929] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-758edda8-234a-4ba0-8aa8-145cece231f9 tempest-ServerDiagnosticsTest-1089501081 tempest-ServerDiagnosticsTest-1089501081-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 593.970132] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-758edda8-234a-4ba0-8aa8-145cece231f9 tempest-ServerDiagnosticsTest-1089501081 tempest-ServerDiagnosticsTest-1089501081-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 593.970294] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-758edda8-234a-4ba0-8aa8-145cece231f9 tempest-ServerDiagnosticsTest-1089501081 tempest-ServerDiagnosticsTest-1089501081-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 593.997129] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-758edda8-234a-4ba0-8aa8-145cece231f9 tempest-ServerDiagnosticsTest-1089501081 tempest-ServerDiagnosticsTest-1089501081-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 593.997362] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-758edda8-234a-4ba0-8aa8-145cece231f9 tempest-ServerDiagnosticsTest-1089501081 tempest-ServerDiagnosticsTest-1089501081-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 
593.997529] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-758edda8-234a-4ba0-8aa8-145cece231f9 tempest-ServerDiagnosticsTest-1089501081 tempest-ServerDiagnosticsTest-1089501081-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 593.997801] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-758edda8-234a-4ba0-8aa8-145cece231f9 tempest-ServerDiagnosticsTest-1089501081 tempest-ServerDiagnosticsTest-1089501081-project-member] Acquiring lock "compute-rpcapi-router" {{(pid=52332) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 593.997959] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-758edda8-234a-4ba0-8aa8-145cece231f9 tempest-ServerDiagnosticsTest-1089501081 tempest-ServerDiagnosticsTest-1089501081-project-member] Acquired lock "compute-rpcapi-router" {{(pid=52332) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 593.998483] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-f4c1adef-4444-4fbe-bb12-fa4fee494cba None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 593.998679] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-f4c1adef-4444-4fbe-bb12-fa4fee494cba None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 593.998876] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-f4c1adef-4444-4fbe-bb12-fa4fee494cba 
None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 593.999258] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-f4c1adef-4444-4fbe-bb12-fa4fee494cba None None] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 593.999424] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-f4c1adef-4444-4fbe-bb12-fa4fee494cba None None] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 593.999682] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-f4c1adef-4444-4fbe-bb12-fa4fee494cba None None] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 594.010170] nova-conductor[52332]: INFO nova.compute.rpcapi [None req-f4c1adef-4444-4fbe-bb12-fa4fee494cba None None] Automatically selected compute RPC version 6.2 from minimum service version 66 [ 594.010170] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-f4c1adef-4444-4fbe-bb12-fa4fee494cba None None] Releasing lock "compute-rpcapi-router" {{(pid=52332) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 601.737013] nova-conductor[52332]: ERROR nova.scheduler.utils [None req-758edda8-234a-4ba0-8aa8-145cece231f9 tempest-ServerDiagnosticsTest-1089501081 
tempest-ServerDiagnosticsTest-1089501081-project-member] [instance: 72a3ff61-01d0-444c-a32d-d9908e0c9c74] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 537, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in 
_allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 77083818-803c-4b50-8f03-3ca049048f14, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 72a3ff61-01d0-444c-a32d-d9908e0c9c74 was re-scheduled: Binding failed for port 77083818-803c-4b50-8f03-3ca049048f14, please check neutron logs for more information.\n'] [ 601.741589] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-758edda8-234a-4ba0-8aa8-145cece231f9 tempest-ServerDiagnosticsTest-1089501081 
tempest-ServerDiagnosticsTest-1089501081-project-member] Rescheduling: True {{(pid=52332) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 601.741589] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-758edda8-234a-4ba0-8aa8-145cece231f9 tempest-ServerDiagnosticsTest-1089501081 tempest-ServerDiagnosticsTest-1089501081-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 72a3ff61-01d0-444c-a32d-d9908e0c9c74.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 72a3ff61-01d0-444c-a32d-d9908e0c9c74. [ 601.741589] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-758edda8-234a-4ba0-8aa8-145cece231f9 tempest-ServerDiagnosticsTest-1089501081 tempest-ServerDiagnosticsTest-1089501081-project-member] [instance: 72a3ff61-01d0-444c-a32d-d9908e0c9c74] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 72a3ff61-01d0-444c-a32d-d9908e0c9c74. [ 601.790157] nova-conductor[52332]: DEBUG nova.network.neutron [None req-758edda8-234a-4ba0-8aa8-145cece231f9 tempest-ServerDiagnosticsTest-1089501081 tempest-ServerDiagnosticsTest-1089501081-project-member] [instance: 72a3ff61-01d0-444c-a32d-d9908e0c9c74] deallocate_for_instance() {{(pid=52332) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 602.126923] nova-conductor[52332]: DEBUG nova.network.neutron [None req-758edda8-234a-4ba0-8aa8-145cece231f9 tempest-ServerDiagnosticsTest-1089501081 tempest-ServerDiagnosticsTest-1089501081-project-member] [instance: 72a3ff61-01d0-444c-a32d-d9908e0c9c74] Instance cache missing network info. 
{{(pid=52332) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 602.135625] nova-conductor[52332]: DEBUG nova.network.neutron [None req-758edda8-234a-4ba0-8aa8-145cece231f9 tempest-ServerDiagnosticsTest-1089501081 tempest-ServerDiagnosticsTest-1089501081-project-member] [instance: 72a3ff61-01d0-444c-a32d-d9908e0c9c74] Updating instance_info_cache with network_info: [] {{(pid=52332) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 602.276931] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-bfdb97b5-d7e0-4d9c-9bb0-7864bb8560ae tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Took 0.22 seconds to select destinations for 1 instance(s). {{(pid=52332) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 602.288099] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-bfdb97b5-d7e0-4d9c-9bb0-7864bb8560ae tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 602.288357] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-bfdb97b5-d7e0-4d9c-9bb0-7864bb8560ae tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 602.288795] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-bfdb97b5-d7e0-4d9c-9bb0-7864bb8560ae tempest-DeleteServersAdminTestJSON-1696381296 
tempest-DeleteServersAdminTestJSON-1696381296-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 602.343208] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-bfdb97b5-d7e0-4d9c-9bb0-7864bb8560ae tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 602.343641] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-bfdb97b5-d7e0-4d9c-9bb0-7864bb8560ae tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 602.344162] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-bfdb97b5-d7e0-4d9c-9bb0-7864bb8560ae tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 602.344653] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-bfdb97b5-d7e0-4d9c-9bb0-7864bb8560ae tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 602.344949] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-bfdb97b5-d7e0-4d9c-9bb0-7864bb8560ae tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 602.345353] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-bfdb97b5-d7e0-4d9c-9bb0-7864bb8560ae tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 602.356320] nova-conductor[52332]: DEBUG nova.quota [None req-bfdb97b5-d7e0-4d9c-9bb0-7864bb8560ae tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Getting quotas for project d6fbe339521f48ef80dfe3dd94ae1046. Resources: {'instances', 'ram', 'cores'} {{(pid=52332) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 602.363673] nova-conductor[52332]: DEBUG nova.quota [None req-bfdb97b5-d7e0-4d9c-9bb0-7864bb8560ae tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Getting quotas for user 0049527d28e746b5a1317b9f14e2e18a and project d6fbe339521f48ef80dfe3dd94ae1046. 
Resources: {'instances', 'ram', 'cores'} {{(pid=52332) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 602.375595] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-bfdb97b5-d7e0-4d9c-9bb0-7864bb8560ae tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] [instance: 035048f2-7780-40db-af06-f807b72d2cd8] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52332) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 602.376324] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-bfdb97b5-d7e0-4d9c-9bb0-7864bb8560ae tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 602.376535] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-bfdb97b5-d7e0-4d9c-9bb0-7864bb8560ae tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 602.376705] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-bfdb97b5-d7e0-4d9c-9bb0-7864bb8560ae tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 602.385572] nova-conductor[52332]: DEBUG 
nova.conductor.manager [None req-bfdb97b5-d7e0-4d9c-9bb0-7864bb8560ae tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] [instance: 035048f2-7780-40db-af06-f807b72d2cd8] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 602.385572] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-bfdb97b5-d7e0-4d9c-9bb0-7864bb8560ae tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 602.385572] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-bfdb97b5-d7e0-4d9c-9bb0-7864bb8560ae tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 602.385572] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-bfdb97b5-d7e0-4d9c-9bb0-7864bb8560ae tempest-DeleteServersAdminTestJSON-1696381296 
tempest-DeleteServersAdminTestJSON-1696381296-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 602.405192] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-bfdb97b5-d7e0-4d9c-9bb0-7864bb8560ae tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 602.405420] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-bfdb97b5-d7e0-4d9c-9bb0-7864bb8560ae tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 602.405580] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-bfdb97b5-d7e0-4d9c-9bb0-7864bb8560ae tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 602.702188] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell1:SELECT=69,nova_cell1:UPDATE=23,nova_cell1:INSERT=4,nova_cell1:SAVEPOINT=2,nova_cell1:RELEASE=2 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 602.800140] 
nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:SELECT=75,nova_cell1:UPDATE=17,nova_cell1:SAVEPOINT=3,nova_cell1:RELEASE=3,nova_cell1:INSERT=2 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 602.852718] nova-conductor[52331]: DEBUG oslo_db.sqlalchemy.engines [None req-0d5a30b6-7724-4afb-9584-a69ce607b150 tempest-ImagesOneServerNegativeTestJSON-860032753 tempest-ImagesOneServerNegativeTestJSON-860032753-project-member] Parent process 51912 forked (52331) with an open database connection, which is being discarded and recreated. {{(pid=52331) checkout /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:434}} [ 602.856223] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writer thread running {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:99}} [ 603.016025] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-0d5a30b6-7724-4afb-9584-a69ce607b150 tempest-ImagesOneServerNegativeTestJSON-860032753 tempest-ImagesOneServerNegativeTestJSON-860032753-project-member] Took 0.15 seconds to select destinations for 1 instance(s). 
{{(pid=52331) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 603.026831] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_api:SELECT=4 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 603.045897] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-0d5a30b6-7724-4afb-9584-a69ce607b150 tempest-ImagesOneServerNegativeTestJSON-860032753 tempest-ImagesOneServerNegativeTestJSON-860032753-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 603.046419] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-0d5a30b6-7724-4afb-9584-a69ce607b150 tempest-ImagesOneServerNegativeTestJSON-860032753 tempest-ImagesOneServerNegativeTestJSON-860032753-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 603.048165] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-0d5a30b6-7724-4afb-9584-a69ce607b150 tempest-ImagesOneServerNegativeTestJSON-860032753 tempest-ImagesOneServerNegativeTestJSON-860032753-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.002s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 603.052149] nova-conductor[52331]: INFO dbcounter [None req-0d5a30b6-7724-4afb-9584-a69ce607b150 tempest-ImagesOneServerNegativeTestJSON-860032753 tempest-ImagesOneServerNegativeTestJSON-860032753-project-member] Registered counter for database nova_cell1 [ 603.055110] nova-conductor[52331]: DEBUG 
oslo_db.sqlalchemy.engines [None req-0d5a30b6-7724-4afb-9584-a69ce607b150 tempest-ImagesOneServerNegativeTestJSON-860032753 tempest-ImagesOneServerNegativeTestJSON-860032753-project-member] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=52331) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 603.057289] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writer thread running {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:99}} [ 603.114294] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-0d5a30b6-7724-4afb-9584-a69ce607b150 tempest-ImagesOneServerNegativeTestJSON-860032753 tempest-ImagesOneServerNegativeTestJSON-860032753-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 603.114765] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-0d5a30b6-7724-4afb-9584-a69ce607b150 tempest-ImagesOneServerNegativeTestJSON-860032753 tempest-ImagesOneServerNegativeTestJSON-860032753-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 603.115643] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-0d5a30b6-7724-4afb-9584-a69ce607b150 tempest-ImagesOneServerNegativeTestJSON-860032753 tempest-ImagesOneServerNegativeTestJSON-860032753-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52331) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 603.116195] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-0d5a30b6-7724-4afb-9584-a69ce607b150 tempest-ImagesOneServerNegativeTestJSON-860032753 tempest-ImagesOneServerNegativeTestJSON-860032753-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 603.116516] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-0d5a30b6-7724-4afb-9584-a69ce607b150 tempest-ImagesOneServerNegativeTestJSON-860032753 tempest-ImagesOneServerNegativeTestJSON-860032753-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 603.116955] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-0d5a30b6-7724-4afb-9584-a69ce607b150 tempest-ImagesOneServerNegativeTestJSON-860032753 tempest-ImagesOneServerNegativeTestJSON-860032753-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 603.118698] nova-conductor[52331]: INFO dbcounter [None req-0d5a30b6-7724-4afb-9584-a69ce607b150 tempest-ImagesOneServerNegativeTestJSON-860032753 tempest-ImagesOneServerNegativeTestJSON-860032753-project-member] Registered counter for database nova_cell0 [ 603.128864] nova-conductor[52331]: DEBUG oslo_db.sqlalchemy.engines [None req-0d5a30b6-7724-4afb-9584-a69ce607b150 tempest-ImagesOneServerNegativeTestJSON-860032753 tempest-ImagesOneServerNegativeTestJSON-860032753-project-member] MySQL server mode 
set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=52331) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 603.131115] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writer thread running {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:99}} [ 603.144486] nova-conductor[52331]: DEBUG nova.quota [None req-0d5a30b6-7724-4afb-9584-a69ce607b150 tempest-ImagesOneServerNegativeTestJSON-860032753 tempest-ImagesOneServerNegativeTestJSON-860032753-project-member] Getting quotas for project 0305090c1cdd48cd8c30981c85fe87ba. Resources: {'instances', 'ram', 'cores'} {{(pid=52331) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 603.148630] nova-conductor[52331]: DEBUG nova.quota [None req-0d5a30b6-7724-4afb-9584-a69ce607b150 tempest-ImagesOneServerNegativeTestJSON-860032753 tempest-ImagesOneServerNegativeTestJSON-860032753-project-member] Getting quotas for user 786399cee42c47d29433fae5ad31ce1b and project 0305090c1cdd48cd8c30981c85fe87ba. 
Resources: {'instances', 'ram', 'cores'} {{(pid=52331) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 603.156080] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-0d5a30b6-7724-4afb-9584-a69ce607b150 tempest-ImagesOneServerNegativeTestJSON-860032753 tempest-ImagesOneServerNegativeTestJSON-860032753-project-member] [instance: e8b2688d-e97d-4992-86b9-aac79fd73403] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52331) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 603.157140] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-0d5a30b6-7724-4afb-9584-a69ce607b150 tempest-ImagesOneServerNegativeTestJSON-860032753 tempest-ImagesOneServerNegativeTestJSON-860032753-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 603.157569] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-0d5a30b6-7724-4afb-9584-a69ce607b150 tempest-ImagesOneServerNegativeTestJSON-860032753 tempest-ImagesOneServerNegativeTestJSON-860032753-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 603.157980] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-0d5a30b6-7724-4afb-9584-a69ce607b150 tempest-ImagesOneServerNegativeTestJSON-860032753 tempest-ImagesOneServerNegativeTestJSON-860032753-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 
603.162634] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-0d5a30b6-7724-4afb-9584-a69ce607b150 tempest-ImagesOneServerNegativeTestJSON-860032753 tempest-ImagesOneServerNegativeTestJSON-860032753-project-member] [instance: e8b2688d-e97d-4992-86b9-aac79fd73403] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 603.163553] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-0d5a30b6-7724-4afb-9584-a69ce607b150 tempest-ImagesOneServerNegativeTestJSON-860032753 tempest-ImagesOneServerNegativeTestJSON-860032753-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 603.163856] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-0d5a30b6-7724-4afb-9584-a69ce607b150 tempest-ImagesOneServerNegativeTestJSON-860032753 tempest-ImagesOneServerNegativeTestJSON-860032753-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 603.164501] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-0d5a30b6-7724-4afb-9584-a69ce607b150 
tempest-ImagesOneServerNegativeTestJSON-860032753 tempest-ImagesOneServerNegativeTestJSON-860032753-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 603.208061] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-0d5a30b6-7724-4afb-9584-a69ce607b150 tempest-ImagesOneServerNegativeTestJSON-860032753 tempest-ImagesOneServerNegativeTestJSON-860032753-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 603.208300] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-0d5a30b6-7724-4afb-9584-a69ce607b150 tempest-ImagesOneServerNegativeTestJSON-860032753 tempest-ImagesOneServerNegativeTestJSON-860032753-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 603.208476] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-0d5a30b6-7724-4afb-9584-a69ce607b150 tempest-ImagesOneServerNegativeTestJSON-860032753 tempest-ImagesOneServerNegativeTestJSON-860032753-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 603.208764] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-0d5a30b6-7724-4afb-9584-a69ce607b150 tempest-ImagesOneServerNegativeTestJSON-860032753 
tempest-ImagesOneServerNegativeTestJSON-860032753-project-member] Acquiring lock "compute-rpcapi-router" {{(pid=52331) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 603.208882] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-0d5a30b6-7724-4afb-9584-a69ce607b150 tempest-ImagesOneServerNegativeTestJSON-860032753 tempest-ImagesOneServerNegativeTestJSON-860032753-project-member] Acquired lock "compute-rpcapi-router" {{(pid=52331) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 603.209445] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-739560b0-1a4a-477d-b4f4-d88d35bfeb92 None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 603.209609] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-739560b0-1a4a-477d-b4f4-d88d35bfeb92 None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 603.209775] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-739560b0-1a4a-477d-b4f4-d88d35bfeb92 None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 603.210202] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-739560b0-1a4a-477d-b4f4-d88d35bfeb92 None None] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 603.210479] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-739560b0-1a4a-477d-b4f4-d88d35bfeb92 None None] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 603.210733] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-739560b0-1a4a-477d-b4f4-d88d35bfeb92 None None] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 603.219158] nova-conductor[52331]: INFO nova.compute.rpcapi [None req-739560b0-1a4a-477d-b4f4-d88d35bfeb92 None None] Automatically selected compute RPC version 6.2 from minimum service version 66 [ 603.219443] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-739560b0-1a4a-477d-b4f4-d88d35bfeb92 None None] Releasing lock "compute-rpcapi-router" {{(pid=52331) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 603.576956] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-4aa3d473-96d4-4adf-9286-43531d256489 tempest-ServerTagsTestJSON-503976635 tempest-ServerTagsTestJSON-503976635-project-member] Took 0.17 seconds to select destinations for 1 instance(s). 
{{(pid=52332) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 603.591410] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-4aa3d473-96d4-4adf-9286-43531d256489 tempest-ServerTagsTestJSON-503976635 tempest-ServerTagsTestJSON-503976635-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 603.591643] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-4aa3d473-96d4-4adf-9286-43531d256489 tempest-ServerTagsTestJSON-503976635 tempest-ServerTagsTestJSON-503976635-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 603.591816] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-4aa3d473-96d4-4adf-9286-43531d256489 tempest-ServerTagsTestJSON-503976635 tempest-ServerTagsTestJSON-503976635-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 603.642422] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-4aa3d473-96d4-4adf-9286-43531d256489 tempest-ServerTagsTestJSON-503976635 tempest-ServerTagsTestJSON-503976635-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 603.642543] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-4aa3d473-96d4-4adf-9286-43531d256489 
tempest-ServerTagsTestJSON-503976635 tempest-ServerTagsTestJSON-503976635-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 603.642711] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-4aa3d473-96d4-4adf-9286-43531d256489 tempest-ServerTagsTestJSON-503976635 tempest-ServerTagsTestJSON-503976635-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 603.643066] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-4aa3d473-96d4-4adf-9286-43531d256489 tempest-ServerTagsTestJSON-503976635 tempest-ServerTagsTestJSON-503976635-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 603.643241] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-4aa3d473-96d4-4adf-9286-43531d256489 tempest-ServerTagsTestJSON-503976635 tempest-ServerTagsTestJSON-503976635-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 603.643393] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-4aa3d473-96d4-4adf-9286-43531d256489 tempest-ServerTagsTestJSON-503976635 tempest-ServerTagsTestJSON-503976635-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 603.651751] nova-conductor[52332]: DEBUG nova.quota [None req-4aa3d473-96d4-4adf-9286-43531d256489 tempest-ServerTagsTestJSON-503976635 tempest-ServerTagsTestJSON-503976635-project-member] Getting quotas for project 1bdd811bd22446b4b0e2a1ef9998943e. Resources: {'instances', 'ram', 'cores'} {{(pid=52332) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 603.654359] nova-conductor[52332]: DEBUG nova.quota [None req-4aa3d473-96d4-4adf-9286-43531d256489 tempest-ServerTagsTestJSON-503976635 tempest-ServerTagsTestJSON-503976635-project-member] Getting quotas for user 019cb3944dd040d6a05d541ee783b343 and project 1bdd811bd22446b4b0e2a1ef9998943e. Resources: {'instances', 'ram', 'cores'} {{(pid=52332) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 603.661058] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-4aa3d473-96d4-4adf-9286-43531d256489 tempest-ServerTagsTestJSON-503976635 tempest-ServerTagsTestJSON-503976635-project-member] [instance: 9e1ead6b-e86f-45d9-906d-b2c97138adff] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52332) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 603.661580] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-4aa3d473-96d4-4adf-9286-43531d256489 tempest-ServerTagsTestJSON-503976635 tempest-ServerTagsTestJSON-503976635-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 603.661782] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-4aa3d473-96d4-4adf-9286-43531d256489 tempest-ServerTagsTestJSON-503976635 
tempest-ServerTagsTestJSON-503976635-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 603.661955] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-4aa3d473-96d4-4adf-9286-43531d256489 tempest-ServerTagsTestJSON-503976635 tempest-ServerTagsTestJSON-503976635-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 603.664802] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-4aa3d473-96d4-4adf-9286-43531d256489 tempest-ServerTagsTestJSON-503976635 tempest-ServerTagsTestJSON-503976635-project-member] [instance: 9e1ead6b-e86f-45d9-906d-b2c97138adff] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 603.665445] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-4aa3d473-96d4-4adf-9286-43531d256489 tempest-ServerTagsTestJSON-503976635 tempest-ServerTagsTestJSON-503976635-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 603.665644] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-4aa3d473-96d4-4adf-9286-43531d256489 tempest-ServerTagsTestJSON-503976635 tempest-ServerTagsTestJSON-503976635-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 603.665819] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-4aa3d473-96d4-4adf-9286-43531d256489 tempest-ServerTagsTestJSON-503976635 tempest-ServerTagsTestJSON-503976635-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 603.667375] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:SELECT=33,nova_cell1:SAVEPOINT=3,nova_cell1:INSERT=61,nova_cell1:RELEASE=3 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 603.718287] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-4aa3d473-96d4-4adf-9286-43531d256489 tempest-ServerTagsTestJSON-503976635 tempest-ServerTagsTestJSON-503976635-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 603.718517] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-4aa3d473-96d4-4adf-9286-43531d256489 tempest-ServerTagsTestJSON-503976635 tempest-ServerTagsTestJSON-503976635-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 603.718791] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-4aa3d473-96d4-4adf-9286-43531d256489 tempest-ServerTagsTestJSON-503976635 tempest-ServerTagsTestJSON-503976635-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 604.210106] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-3d9b90f0-0418-47e5-bf2e-fb2ad21d72b7 tempest-ImagesNegativeTestJSON-2094128861 tempest-ImagesNegativeTestJSON-2094128861-project-member] Took 0.21 seconds to select destinations for 1 instance(s). {{(pid=52331) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 604.264998] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-3d9b90f0-0418-47e5-bf2e-fb2ad21d72b7 tempest-ImagesNegativeTestJSON-2094128861 tempest-ImagesNegativeTestJSON-2094128861-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 604.264998] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-3d9b90f0-0418-47e5-bf2e-fb2ad21d72b7 tempest-ImagesNegativeTestJSON-2094128861 tempest-ImagesNegativeTestJSON-2094128861-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 604.264998] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None 
req-3d9b90f0-0418-47e5-bf2e-fb2ad21d72b7 tempest-ImagesNegativeTestJSON-2094128861 tempest-ImagesNegativeTestJSON-2094128861-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 604.320978] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-3d9b90f0-0418-47e5-bf2e-fb2ad21d72b7 tempest-ImagesNegativeTestJSON-2094128861 tempest-ImagesNegativeTestJSON-2094128861-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 604.321357] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-3d9b90f0-0418-47e5-bf2e-fb2ad21d72b7 tempest-ImagesNegativeTestJSON-2094128861 tempest-ImagesNegativeTestJSON-2094128861-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 604.321639] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-3d9b90f0-0418-47e5-bf2e-fb2ad21d72b7 tempest-ImagesNegativeTestJSON-2094128861 tempest-ImagesNegativeTestJSON-2094128861-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 604.322506] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-3d9b90f0-0418-47e5-bf2e-fb2ad21d72b7 tempest-ImagesNegativeTestJSON-2094128861 tempest-ImagesNegativeTestJSON-2094128861-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 604.322795] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-3d9b90f0-0418-47e5-bf2e-fb2ad21d72b7 tempest-ImagesNegativeTestJSON-2094128861 tempest-ImagesNegativeTestJSON-2094128861-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 604.323059] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-3d9b90f0-0418-47e5-bf2e-fb2ad21d72b7 tempest-ImagesNegativeTestJSON-2094128861 tempest-ImagesNegativeTestJSON-2094128861-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 604.335863] nova-conductor[52331]: DEBUG nova.quota [None req-3d9b90f0-0418-47e5-bf2e-fb2ad21d72b7 tempest-ImagesNegativeTestJSON-2094128861 tempest-ImagesNegativeTestJSON-2094128861-project-member] Getting quotas for project 5439b91021b94bd6a164191b0ea92edb. Resources: {'instances', 'ram', 'cores'} {{(pid=52331) _get_quotas /opt/stack/nova/nova/quota.py:393}}
[ 604.338436] nova-conductor[52331]: DEBUG nova.quota [None req-3d9b90f0-0418-47e5-bf2e-fb2ad21d72b7 tempest-ImagesNegativeTestJSON-2094128861 tempest-ImagesNegativeTestJSON-2094128861-project-member] Getting quotas for user 564aea2851b8477ba7383153ac1394c2 and project 5439b91021b94bd6a164191b0ea92edb. Resources: {'instances', 'ram', 'cores'} {{(pid=52331) _get_quotas /opt/stack/nova/nova/quota.py:383}}
[ 604.347011] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-3d9b90f0-0418-47e5-bf2e-fb2ad21d72b7 tempest-ImagesNegativeTestJSON-2094128861 tempest-ImagesNegativeTestJSON-2094128861-project-member] [instance: 5a5ada63-8ae0-40e9-96b1-d60b040ec42b] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52331) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}}
[ 604.348107] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-3d9b90f0-0418-47e5-bf2e-fb2ad21d72b7 tempest-ImagesNegativeTestJSON-2094128861 tempest-ImagesNegativeTestJSON-2094128861-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 604.348478] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-3d9b90f0-0418-47e5-bf2e-fb2ad21d72b7 tempest-ImagesNegativeTestJSON-2094128861 tempest-ImagesNegativeTestJSON-2094128861-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 604.349022] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-3d9b90f0-0418-47e5-bf2e-fb2ad21d72b7 tempest-ImagesNegativeTestJSON-2094128861 tempest-ImagesNegativeTestJSON-2094128861-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 604.352739] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-3d9b90f0-0418-47e5-bf2e-fb2ad21d72b7 tempest-ImagesNegativeTestJSON-2094128861 tempest-ImagesNegativeTestJSON-2094128861-project-member] [instance: 5a5ada63-8ae0-40e9-96b1-d60b040ec42b] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 604.354341] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-3d9b90f0-0418-47e5-bf2e-fb2ad21d72b7 tempest-ImagesNegativeTestJSON-2094128861 tempest-ImagesNegativeTestJSON-2094128861-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 604.354976] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-3d9b90f0-0418-47e5-bf2e-fb2ad21d72b7 tempest-ImagesNegativeTestJSON-2094128861 tempest-ImagesNegativeTestJSON-2094128861-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 604.355280] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-3d9b90f0-0418-47e5-bf2e-fb2ad21d72b7 tempest-ImagesNegativeTestJSON-2094128861 tempest-ImagesNegativeTestJSON-2094128861-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 604.382545] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-3d9b90f0-0418-47e5-bf2e-fb2ad21d72b7 tempest-ImagesNegativeTestJSON-2094128861 tempest-ImagesNegativeTestJSON-2094128861-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 604.382767] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-3d9b90f0-0418-47e5-bf2e-fb2ad21d72b7 tempest-ImagesNegativeTestJSON-2094128861 tempest-ImagesNegativeTestJSON-2094128861-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 604.382932] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-3d9b90f0-0418-47e5-bf2e-fb2ad21d72b7 tempest-ImagesNegativeTestJSON-2094128861 tempest-ImagesNegativeTestJSON-2094128861-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 604.845250] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell1:SELECT=77,nova_cell1:UPDATE=21,nova_cell1:INSERT=2 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 605.625719] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-67691a58-bf0e-4008-a0ab-ded742c0dcca tempest-ServerDiagnosticsNegativeTest-1130413716 tempest-ServerDiagnosticsNegativeTest-1130413716-project-member] Took 0.17 seconds to select destinations for 1 instance(s). {{(pid=52332) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}}
[ 605.646417] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-67691a58-bf0e-4008-a0ab-ded742c0dcca tempest-ServerDiagnosticsNegativeTest-1130413716 tempest-ServerDiagnosticsNegativeTest-1130413716-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 605.646686] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-67691a58-bf0e-4008-a0ab-ded742c0dcca tempest-ServerDiagnosticsNegativeTest-1130413716 tempest-ServerDiagnosticsNegativeTest-1130413716-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 605.646858] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-67691a58-bf0e-4008-a0ab-ded742c0dcca tempest-ServerDiagnosticsNegativeTest-1130413716 tempest-ServerDiagnosticsNegativeTest-1130413716-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 605.688501] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-67691a58-bf0e-4008-a0ab-ded742c0dcca tempest-ServerDiagnosticsNegativeTest-1130413716 tempest-ServerDiagnosticsNegativeTest-1130413716-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 605.689096] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-67691a58-bf0e-4008-a0ab-ded742c0dcca tempest-ServerDiagnosticsNegativeTest-1130413716 tempest-ServerDiagnosticsNegativeTest-1130413716-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 605.691322] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-67691a58-bf0e-4008-a0ab-ded742c0dcca tempest-ServerDiagnosticsNegativeTest-1130413716 tempest-ServerDiagnosticsNegativeTest-1130413716-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 605.691322] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-67691a58-bf0e-4008-a0ab-ded742c0dcca tempest-ServerDiagnosticsNegativeTest-1130413716 tempest-ServerDiagnosticsNegativeTest-1130413716-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 605.691322] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-67691a58-bf0e-4008-a0ab-ded742c0dcca tempest-ServerDiagnosticsNegativeTest-1130413716 tempest-ServerDiagnosticsNegativeTest-1130413716-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 605.691322] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-67691a58-bf0e-4008-a0ab-ded742c0dcca tempest-ServerDiagnosticsNegativeTest-1130413716 tempest-ServerDiagnosticsNegativeTest-1130413716-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 605.698585] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:UPDATE=21,nova_cell1:SELECT=75,nova_cell1:INSERT=4 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 605.700703] nova-conductor[52332]: DEBUG nova.quota [None req-67691a58-bf0e-4008-a0ab-ded742c0dcca tempest-ServerDiagnosticsNegativeTest-1130413716 tempest-ServerDiagnosticsNegativeTest-1130413716-project-member] Getting quotas for project c21bbc06a1c34f03a61acb1ace2c287b. Resources: {'instances', 'ram', 'cores'} {{(pid=52332) _get_quotas /opt/stack/nova/nova/quota.py:393}}
[ 605.703237] nova-conductor[52332]: DEBUG nova.quota [None req-67691a58-bf0e-4008-a0ab-ded742c0dcca tempest-ServerDiagnosticsNegativeTest-1130413716 tempest-ServerDiagnosticsNegativeTest-1130413716-project-member] Getting quotas for user 51ff751cb3154ff4b558ac64522e1985 and project c21bbc06a1c34f03a61acb1ace2c287b. Resources: {'instances', 'ram', 'cores'} {{(pid=52332) _get_quotas /opt/stack/nova/nova/quota.py:383}}
[ 605.709132] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-67691a58-bf0e-4008-a0ab-ded742c0dcca tempest-ServerDiagnosticsNegativeTest-1130413716 tempest-ServerDiagnosticsNegativeTest-1130413716-project-member] [instance: d7d47cc0-277b-4b8e-9cce-c5c9e4099205] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52332) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}}
[ 605.709731] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-67691a58-bf0e-4008-a0ab-ded742c0dcca tempest-ServerDiagnosticsNegativeTest-1130413716 tempest-ServerDiagnosticsNegativeTest-1130413716-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 605.709896] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-67691a58-bf0e-4008-a0ab-ded742c0dcca tempest-ServerDiagnosticsNegativeTest-1130413716 tempest-ServerDiagnosticsNegativeTest-1130413716-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 605.710060] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-67691a58-bf0e-4008-a0ab-ded742c0dcca tempest-ServerDiagnosticsNegativeTest-1130413716 tempest-ServerDiagnosticsNegativeTest-1130413716-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 605.713013] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-67691a58-bf0e-4008-a0ab-ded742c0dcca tempest-ServerDiagnosticsNegativeTest-1130413716 tempest-ServerDiagnosticsNegativeTest-1130413716-project-member] [instance: d7d47cc0-277b-4b8e-9cce-c5c9e4099205] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 605.714589] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-67691a58-bf0e-4008-a0ab-ded742c0dcca tempest-ServerDiagnosticsNegativeTest-1130413716 tempest-ServerDiagnosticsNegativeTest-1130413716-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 605.714793] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-67691a58-bf0e-4008-a0ab-ded742c0dcca tempest-ServerDiagnosticsNegativeTest-1130413716 tempest-ServerDiagnosticsNegativeTest-1130413716-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 605.714954] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-67691a58-bf0e-4008-a0ab-ded742c0dcca tempest-ServerDiagnosticsNegativeTest-1130413716 tempest-ServerDiagnosticsNegativeTest-1130413716-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 605.735868] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-67691a58-bf0e-4008-a0ab-ded742c0dcca tempest-ServerDiagnosticsNegativeTest-1130413716 tempest-ServerDiagnosticsNegativeTest-1130413716-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 605.736083] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-67691a58-bf0e-4008-a0ab-ded742c0dcca tempest-ServerDiagnosticsNegativeTest-1130413716 tempest-ServerDiagnosticsNegativeTest-1130413716-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 605.736263] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-67691a58-bf0e-4008-a0ab-ded742c0dcca tempest-ServerDiagnosticsNegativeTest-1130413716 tempest-ServerDiagnosticsNegativeTest-1130413716-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 609.168199] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-234b7c78-0994-4f70-82d0-96e8e34930b5 tempest-FloatingIPsAssociationTestJSON-345149790 tempest-FloatingIPsAssociationTestJSON-345149790-project-member] Took 0.17 seconds to select destinations for 1 instance(s). {{(pid=52332) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}}
[ 609.182346] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-234b7c78-0994-4f70-82d0-96e8e34930b5 tempest-FloatingIPsAssociationTestJSON-345149790 tempest-FloatingIPsAssociationTestJSON-345149790-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 609.182693] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-234b7c78-0994-4f70-82d0-96e8e34930b5 tempest-FloatingIPsAssociationTestJSON-345149790 tempest-FloatingIPsAssociationTestJSON-345149790-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 609.182992] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-234b7c78-0994-4f70-82d0-96e8e34930b5 tempest-FloatingIPsAssociationTestJSON-345149790 tempest-FloatingIPsAssociationTestJSON-345149790-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 609.221405] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-234b7c78-0994-4f70-82d0-96e8e34930b5 tempest-FloatingIPsAssociationTestJSON-345149790 tempest-FloatingIPsAssociationTestJSON-345149790-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 609.221646] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-234b7c78-0994-4f70-82d0-96e8e34930b5 tempest-FloatingIPsAssociationTestJSON-345149790 tempest-FloatingIPsAssociationTestJSON-345149790-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 609.222335] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-234b7c78-0994-4f70-82d0-96e8e34930b5 tempest-FloatingIPsAssociationTestJSON-345149790 tempest-FloatingIPsAssociationTestJSON-345149790-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 609.223037] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-234b7c78-0994-4f70-82d0-96e8e34930b5 tempest-FloatingIPsAssociationTestJSON-345149790 tempest-FloatingIPsAssociationTestJSON-345149790-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 609.223423] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-234b7c78-0994-4f70-82d0-96e8e34930b5 tempest-FloatingIPsAssociationTestJSON-345149790 tempest-FloatingIPsAssociationTestJSON-345149790-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 609.223809] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-234b7c78-0994-4f70-82d0-96e8e34930b5 tempest-FloatingIPsAssociationTestJSON-345149790 tempest-FloatingIPsAssociationTestJSON-345149790-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 609.237424] nova-conductor[52332]: DEBUG nova.quota [None req-234b7c78-0994-4f70-82d0-96e8e34930b5 tempest-FloatingIPsAssociationTestJSON-345149790 tempest-FloatingIPsAssociationTestJSON-345149790-project-member] Getting quotas for project a530701f7c79417f9a29f90bd3905fd6. Resources: {'instances', 'ram', 'cores'} {{(pid=52332) _get_quotas /opt/stack/nova/nova/quota.py:393}}
[ 609.242053] nova-conductor[52332]: DEBUG nova.quota [None req-234b7c78-0994-4f70-82d0-96e8e34930b5 tempest-FloatingIPsAssociationTestJSON-345149790 tempest-FloatingIPsAssociationTestJSON-345149790-project-member] Getting quotas for user 7ee3bb08b32743dfa6a330a96a77b1a5 and project a530701f7c79417f9a29f90bd3905fd6. Resources: {'instances', 'ram', 'cores'} {{(pid=52332) _get_quotas /opt/stack/nova/nova/quota.py:383}}
[ 609.251328] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-234b7c78-0994-4f70-82d0-96e8e34930b5 tempest-FloatingIPsAssociationTestJSON-345149790 tempest-FloatingIPsAssociationTestJSON-345149790-project-member] [instance: 1d44d43a-86a5-4415-bcdd-afd2790adaee] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52332) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}}
[ 609.252679] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-234b7c78-0994-4f70-82d0-96e8e34930b5 tempest-FloatingIPsAssociationTestJSON-345149790 tempest-FloatingIPsAssociationTestJSON-345149790-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 609.252868] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-234b7c78-0994-4f70-82d0-96e8e34930b5 tempest-FloatingIPsAssociationTestJSON-345149790 tempest-FloatingIPsAssociationTestJSON-345149790-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 609.253287] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-234b7c78-0994-4f70-82d0-96e8e34930b5 tempest-FloatingIPsAssociationTestJSON-345149790 tempest-FloatingIPsAssociationTestJSON-345149790-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 609.259926] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-234b7c78-0994-4f70-82d0-96e8e34930b5 tempest-FloatingIPsAssociationTestJSON-345149790 tempest-FloatingIPsAssociationTestJSON-345149790-project-member] [instance: 1d44d43a-86a5-4415-bcdd-afd2790adaee] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 609.261163] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-234b7c78-0994-4f70-82d0-96e8e34930b5 tempest-FloatingIPsAssociationTestJSON-345149790 tempest-FloatingIPsAssociationTestJSON-345149790-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 609.263248] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-234b7c78-0994-4f70-82d0-96e8e34930b5 tempest-FloatingIPsAssociationTestJSON-345149790 tempest-FloatingIPsAssociationTestJSON-345149790-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 609.263248] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-234b7c78-0994-4f70-82d0-96e8e34930b5 tempest-FloatingIPsAssociationTestJSON-345149790 tempest-FloatingIPsAssociationTestJSON-345149790-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 609.280326] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-234b7c78-0994-4f70-82d0-96e8e34930b5 tempest-FloatingIPsAssociationTestJSON-345149790 tempest-FloatingIPsAssociationTestJSON-345149790-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 609.280556] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-234b7c78-0994-4f70-82d0-96e8e34930b5 tempest-FloatingIPsAssociationTestJSON-345149790 tempest-FloatingIPsAssociationTestJSON-345149790-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 609.280721] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-234b7c78-0994-4f70-82d0-96e8e34930b5 tempest-FloatingIPsAssociationTestJSON-345149790 tempest-FloatingIPsAssociationTestJSON-345149790-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 610.541443] nova-conductor[52332]: ERROR nova.scheduler.utils [None req-bfdb97b5-d7e0-4d9c-9bb0-7864bb8560ae tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] [instance: 035048f2-7780-40db-af06-f807b72d2cd8] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 537, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 87429c08-63fb-482c-87ed-e16a2232d8cc, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 035048f2-7780-40db-af06-f807b72d2cd8 was re-scheduled: Binding failed for port 87429c08-63fb-482c-87ed-e16a2232d8cc, please check neutron logs for more information.\n']
[ 610.546580] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-bfdb97b5-d7e0-4d9c-9bb0-7864bb8560ae tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Rescheduling: True {{(pid=52332) build_instances /opt/stack/nova/nova/conductor/manager.py:695}}
[ 610.547037] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-bfdb97b5-d7e0-4d9c-9bb0-7864bb8560ae tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 035048f2-7780-40db-af06-f807b72d2cd8.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 035048f2-7780-40db-af06-f807b72d2cd8.
[ 610.547496] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-bfdb97b5-d7e0-4d9c-9bb0-7864bb8560ae tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] [instance: 035048f2-7780-40db-af06-f807b72d2cd8] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 035048f2-7780-40db-af06-f807b72d2cd8.
[ 610.590721] nova-conductor[52332]: DEBUG nova.network.neutron [None req-bfdb97b5-d7e0-4d9c-9bb0-7864bb8560ae tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] [instance: 035048f2-7780-40db-af06-f807b72d2cd8] deallocate_for_instance() {{(pid=52332) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}}
[ 610.601349] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-2818ef5d-25bb-4c92-ad65-da84b369cc0a tempest-ServerDiagnosticsV248Test-1628898399 tempest-ServerDiagnosticsV248Test-1628898399-project-member] Took 0.16 seconds to select destinations for 1 instance(s). {{(pid=52332) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}}
[ 610.626486] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-2818ef5d-25bb-4c92-ad65-da84b369cc0a tempest-ServerDiagnosticsV248Test-1628898399 tempest-ServerDiagnosticsV248Test-1628898399-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 610.626717] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-2818ef5d-25bb-4c92-ad65-da84b369cc0a tempest-ServerDiagnosticsV248Test-1628898399 tempest-ServerDiagnosticsV248Test-1628898399-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 610.626885] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-2818ef5d-25bb-4c92-ad65-da84b369cc0a tempest-ServerDiagnosticsV248Test-1628898399 tempest-ServerDiagnosticsV248Test-1628898399-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 610.669896] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-2818ef5d-25bb-4c92-ad65-da84b369cc0a tempest-ServerDiagnosticsV248Test-1628898399 tempest-ServerDiagnosticsV248Test-1628898399-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 610.670088] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None
req-2818ef5d-25bb-4c92-ad65-da84b369cc0a tempest-ServerDiagnosticsV248Test-1628898399 tempest-ServerDiagnosticsV248Test-1628898399-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.670253] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-2818ef5d-25bb-4c92-ad65-da84b369cc0a tempest-ServerDiagnosticsV248Test-1628898399 tempest-ServerDiagnosticsV248Test-1628898399-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.670609] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-2818ef5d-25bb-4c92-ad65-da84b369cc0a tempest-ServerDiagnosticsV248Test-1628898399 tempest-ServerDiagnosticsV248Test-1628898399-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.670794] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-2818ef5d-25bb-4c92-ad65-da84b369cc0a tempest-ServerDiagnosticsV248Test-1628898399 tempest-ServerDiagnosticsV248Test-1628898399-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.670967] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-2818ef5d-25bb-4c92-ad65-da84b369cc0a tempest-ServerDiagnosticsV248Test-1628898399 tempest-ServerDiagnosticsV248Test-1628898399-project-member] 
Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.680768] nova-conductor[52332]: DEBUG nova.quota [None req-2818ef5d-25bb-4c92-ad65-da84b369cc0a tempest-ServerDiagnosticsV248Test-1628898399 tempest-ServerDiagnosticsV248Test-1628898399-project-member] Getting quotas for project e10646c39f5c4532a2de09254970ffbd. Resources: {'instances', 'ram', 'cores'} {{(pid=52332) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 610.683494] nova-conductor[52332]: DEBUG nova.quota [None req-2818ef5d-25bb-4c92-ad65-da84b369cc0a tempest-ServerDiagnosticsV248Test-1628898399 tempest-ServerDiagnosticsV248Test-1628898399-project-member] Getting quotas for user bfe0c015632f47fbb66cdd80cef91dc0 and project e10646c39f5c4532a2de09254970ffbd. Resources: {'instances', 'ram', 'cores'} {{(pid=52332) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 610.693846] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-2818ef5d-25bb-4c92-ad65-da84b369cc0a tempest-ServerDiagnosticsV248Test-1628898399 tempest-ServerDiagnosticsV248Test-1628898399-project-member] [instance: 024de920-7822-450d-93d2-a95bfdd02ff6] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52332) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 610.694358] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-2818ef5d-25bb-4c92-ad65-da84b369cc0a tempest-ServerDiagnosticsV248Test-1628898399 tempest-ServerDiagnosticsV248Test-1628898399-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.694567] nova-conductor[52332]: DEBUG 
oslo_concurrency.lockutils [None req-2818ef5d-25bb-4c92-ad65-da84b369cc0a tempest-ServerDiagnosticsV248Test-1628898399 tempest-ServerDiagnosticsV248Test-1628898399-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.694734] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-2818ef5d-25bb-4c92-ad65-da84b369cc0a tempest-ServerDiagnosticsV248Test-1628898399 tempest-ServerDiagnosticsV248Test-1628898399-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.704444] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-2818ef5d-25bb-4c92-ad65-da84b369cc0a tempest-ServerDiagnosticsV248Test-1628898399 tempest-ServerDiagnosticsV248Test-1628898399-project-member] [instance: 024de920-7822-450d-93d2-a95bfdd02ff6] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 610.704444] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-2818ef5d-25bb-4c92-ad65-da84b369cc0a tempest-ServerDiagnosticsV248Test-1628898399 
tempest-ServerDiagnosticsV248Test-1628898399-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.704444] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-2818ef5d-25bb-4c92-ad65-da84b369cc0a tempest-ServerDiagnosticsV248Test-1628898399 tempest-ServerDiagnosticsV248Test-1628898399-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.704444] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-2818ef5d-25bb-4c92-ad65-da84b369cc0a tempest-ServerDiagnosticsV248Test-1628898399 tempest-ServerDiagnosticsV248Test-1628898399-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.724849] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-2818ef5d-25bb-4c92-ad65-da84b369cc0a tempest-ServerDiagnosticsV248Test-1628898399 tempest-ServerDiagnosticsV248Test-1628898399-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.725268] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-2818ef5d-25bb-4c92-ad65-da84b369cc0a tempest-ServerDiagnosticsV248Test-1628898399 tempest-ServerDiagnosticsV248Test-1628898399-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.725588] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-2818ef5d-25bb-4c92-ad65-da84b369cc0a tempest-ServerDiagnosticsV248Test-1628898399 tempest-ServerDiagnosticsV248Test-1628898399-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 610.765989] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell1:SELECT=69,nova_cell1:UPDATE=21,nova_cell1:INSERT=4,nova_cell1:SAVEPOINT=3,nova_cell1:RELEASE=3 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 610.839873] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:SELECT=75,nova_cell1:UPDATE=20,nova_cell1:INSERT=3,nova_cell1:SAVEPOINT=1,nova_cell1:RELEASE=1 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 610.961150] nova-conductor[52332]: DEBUG nova.network.neutron [None req-bfdb97b5-d7e0-4d9c-9bb0-7864bb8560ae tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] [instance: 035048f2-7780-40db-af06-f807b72d2cd8] Instance cache missing network info. 
{{(pid=52332) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 610.964775] nova-conductor[52332]: DEBUG nova.network.neutron [None req-bfdb97b5-d7e0-4d9c-9bb0-7864bb8560ae tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] [instance: 035048f2-7780-40db-af06-f807b72d2cd8] Updating instance_info_cache with network_info: [] {{(pid=52332) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 612.763148] nova-conductor[52332]: ERROR nova.scheduler.utils [None req-4aa3d473-96d4-4adf-9286-43531d256489 tempest-ServerTagsTestJSON-503976635 tempest-ServerTagsTestJSON-503976635-project-member] [instance: 9e1ead6b-e86f-45d9-906d-b2c97138adff] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 537, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File 
"/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 853be577-1e61-4ce4-8699-a14e0e39e431, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in 
_do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 9e1ead6b-e86f-45d9-906d-b2c97138adff was re-scheduled: Binding failed for port 853be577-1e61-4ce4-8699-a14e0e39e431, please check neutron logs for more information.\n'] [ 612.763835] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-4aa3d473-96d4-4adf-9286-43531d256489 tempest-ServerTagsTestJSON-503976635 tempest-ServerTagsTestJSON-503976635-project-member] Rescheduling: True {{(pid=52332) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 612.764686] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-4aa3d473-96d4-4adf-9286-43531d256489 tempest-ServerTagsTestJSON-503976635 tempest-ServerTagsTestJSON-503976635-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 9e1ead6b-e86f-45d9-906d-b2c97138adff.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 9e1ead6b-e86f-45d9-906d-b2c97138adff. [ 612.765218] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-4aa3d473-96d4-4adf-9286-43531d256489 tempest-ServerTagsTestJSON-503976635 tempest-ServerTagsTestJSON-503976635-project-member] [instance: 9e1ead6b-e86f-45d9-906d-b2c97138adff] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 9e1ead6b-e86f-45d9-906d-b2c97138adff. 
[ 612.796974] nova-conductor[52332]: DEBUG nova.network.neutron [None req-4aa3d473-96d4-4adf-9286-43531d256489 tempest-ServerTagsTestJSON-503976635 tempest-ServerTagsTestJSON-503976635-project-member] [instance: 9e1ead6b-e86f-45d9-906d-b2c97138adff] deallocate_for_instance() {{(pid=52332) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 613.039593] nova-conductor[52332]: DEBUG nova.network.neutron [None req-4aa3d473-96d4-4adf-9286-43531d256489 tempest-ServerTagsTestJSON-503976635 tempest-ServerTagsTestJSON-503976635-project-member] [instance: 9e1ead6b-e86f-45d9-906d-b2c97138adff] Instance cache missing network info. {{(pid=52332) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 613.044827] nova-conductor[52332]: DEBUG nova.network.neutron [None req-4aa3d473-96d4-4adf-9286-43531d256489 tempest-ServerTagsTestJSON-503976635 tempest-ServerTagsTestJSON-503976635-project-member] [instance: 9e1ead6b-e86f-45d9-906d-b2c97138adff] Updating instance_info_cache with network_info: [] {{(pid=52332) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 613.129752] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-81781ab6-33ba-48ed-9284-0f5250082793 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Took 0.15 seconds to select destinations for 1 instance(s). 
{{(pid=52331) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 613.142347] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-81781ab6-33ba-48ed-9284-0f5250082793 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 613.142629] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-81781ab6-33ba-48ed-9284-0f5250082793 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 613.142819] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-81781ab6-33ba-48ed-9284-0f5250082793 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 613.172309] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-81781ab6-33ba-48ed-9284-0f5250082793 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 613.172610] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None 
req-81781ab6-33ba-48ed-9284-0f5250082793 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 613.172705] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-81781ab6-33ba-48ed-9284-0f5250082793 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 613.173042] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-81781ab6-33ba-48ed-9284-0f5250082793 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 613.173258] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-81781ab6-33ba-48ed-9284-0f5250082793 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 613.173423] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-81781ab6-33ba-48ed-9284-0f5250082793 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock 
"59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 613.183044] nova-conductor[52331]: DEBUG nova.quota [None req-81781ab6-33ba-48ed-9284-0f5250082793 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Getting quotas for project f31d6cd1c9a045beacfc60af31d1ffad. Resources: {'instances', 'ram', 'cores'} {{(pid=52331) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 613.186249] nova-conductor[52331]: DEBUG nova.quota [None req-81781ab6-33ba-48ed-9284-0f5250082793 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Getting quotas for user 32d3f462daeb4beb923d97ca93470591 and project f31d6cd1c9a045beacfc60af31d1ffad. Resources: {'instances', 'ram', 'cores'} {{(pid=52331) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 613.193501] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-81781ab6-33ba-48ed-9284-0f5250082793 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] [instance: 3c98bffb-a7be-47bc-9ce2-610b6ce3e05c] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52331) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 613.194029] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-81781ab6-33ba-48ed-9284-0f5250082793 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 613.194244] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None 
req-81781ab6-33ba-48ed-9284-0f5250082793 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 613.194409] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-81781ab6-33ba-48ed-9284-0f5250082793 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 613.197642] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-81781ab6-33ba-48ed-9284-0f5250082793 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] [instance: 3c98bffb-a7be-47bc-9ce2-610b6ce3e05c] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 613.198327] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-81781ab6-33ba-48ed-9284-0f5250082793 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Acquiring lock 
"59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 613.198533] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-81781ab6-33ba-48ed-9284-0f5250082793 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 613.198702] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-81781ab6-33ba-48ed-9284-0f5250082793 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 613.213534] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-81781ab6-33ba-48ed-9284-0f5250082793 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 613.213850] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-81781ab6-33ba-48ed-9284-0f5250082793 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 613.214192] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-81781ab6-33ba-48ed-9284-0f5250082793 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 613.700741] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:SELECT=71,nova_cell1:UPDATE=20,nova_cell1:INSERT=3,nova_cell1:SAVEPOINT=3,nova_cell1:RELEASE=3 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 613.718200] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell1:UPDATE=15,nova_cell1:SELECT=71,nova_cell1:SAVEPOINT=6,nova_cell1:RELEASE=6,nova_cell1:INSERT=2 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 613.798940] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-4e124a28-1cf4-4e90-bf63-969df95c76e0 tempest-ServersAdmin275Test-635699095 tempest-ServersAdmin275Test-635699095-project-member] Took 0.16 seconds to select destinations for 1 instance(s). 
{{(pid=52332) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 613.848480] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-4e124a28-1cf4-4e90-bf63-969df95c76e0 tempest-ServersAdmin275Test-635699095 tempest-ServersAdmin275Test-635699095-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 613.848480] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-4e124a28-1cf4-4e90-bf63-969df95c76e0 tempest-ServersAdmin275Test-635699095 tempest-ServersAdmin275Test-635699095-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 613.848480] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-4e124a28-1cf4-4e90-bf63-969df95c76e0 tempest-ServersAdmin275Test-635699095 tempest-ServersAdmin275Test-635699095-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 613.883500] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:SELECT=30,nova_cell1:SAVEPOINT=4,nova_cell1:INSERT=62,nova_cell1:RELEASE=4 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 613.922206] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-4e124a28-1cf4-4e90-bf63-969df95c76e0 tempest-ServersAdmin275Test-635699095 tempest-ServersAdmin275Test-635699095-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 613.922442] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-4e124a28-1cf4-4e90-bf63-969df95c76e0 tempest-ServersAdmin275Test-635699095 tempest-ServersAdmin275Test-635699095-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 613.922591] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-4e124a28-1cf4-4e90-bf63-969df95c76e0 tempest-ServersAdmin275Test-635699095 tempest-ServersAdmin275Test-635699095-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 613.923176] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-4e124a28-1cf4-4e90-bf63-969df95c76e0 tempest-ServersAdmin275Test-635699095 tempest-ServersAdmin275Test-635699095-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 613.923721] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-4e124a28-1cf4-4e90-bf63-969df95c76e0 tempest-ServersAdmin275Test-635699095 tempest-ServersAdmin275Test-635699095-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 613.923721] 
nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-4e124a28-1cf4-4e90-bf63-969df95c76e0 tempest-ServersAdmin275Test-635699095 tempest-ServersAdmin275Test-635699095-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 613.932435] nova-conductor[52332]: DEBUG nova.quota [None req-4e124a28-1cf4-4e90-bf63-969df95c76e0 tempest-ServersAdmin275Test-635699095 tempest-ServersAdmin275Test-635699095-project-member] Getting quotas for project 37f86084782b42ca854a87a598d3c185. Resources: {'instances', 'ram', 'cores'} {{(pid=52332) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 613.934837] nova-conductor[52332]: DEBUG nova.quota [None req-4e124a28-1cf4-4e90-bf63-969df95c76e0 tempest-ServersAdmin275Test-635699095 tempest-ServersAdmin275Test-635699095-project-member] Getting quotas for user 75087c02293f4a0abd89dd5e2313f44f and project 37f86084782b42ca854a87a598d3c185. 
Resources: {'instances', 'ram', 'cores'} {{(pid=52332) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 613.941257] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-4e124a28-1cf4-4e90-bf63-969df95c76e0 tempest-ServersAdmin275Test-635699095 tempest-ServersAdmin275Test-635699095-project-member] [instance: 16ad02eb-ec64-475a-baeb-f995578b154d] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52332) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 613.941711] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-4e124a28-1cf4-4e90-bf63-969df95c76e0 tempest-ServersAdmin275Test-635699095 tempest-ServersAdmin275Test-635699095-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 613.941917] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-4e124a28-1cf4-4e90-bf63-969df95c76e0 tempest-ServersAdmin275Test-635699095 tempest-ServersAdmin275Test-635699095-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 613.942160] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-4e124a28-1cf4-4e90-bf63-969df95c76e0 tempest-ServersAdmin275Test-635699095 tempest-ServersAdmin275Test-635699095-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 613.944948] nova-conductor[52332]: DEBUG nova.conductor.manager [None 
req-4e124a28-1cf4-4e90-bf63-969df95c76e0 tempest-ServersAdmin275Test-635699095 tempest-ServersAdmin275Test-635699095-project-member] [instance: 16ad02eb-ec64-475a-baeb-f995578b154d] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 613.945707] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-4e124a28-1cf4-4e90-bf63-969df95c76e0 tempest-ServersAdmin275Test-635699095 tempest-ServersAdmin275Test-635699095-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 613.945827] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-4e124a28-1cf4-4e90-bf63-969df95c76e0 tempest-ServersAdmin275Test-635699095 tempest-ServersAdmin275Test-635699095-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 613.945991] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-4e124a28-1cf4-4e90-bf63-969df95c76e0 tempest-ServersAdmin275Test-635699095 tempest-ServersAdmin275Test-635699095-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 613.958036] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-4e124a28-1cf4-4e90-bf63-969df95c76e0 tempest-ServersAdmin275Test-635699095 tempest-ServersAdmin275Test-635699095-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 613.958252] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-4e124a28-1cf4-4e90-bf63-969df95c76e0 tempest-ServersAdmin275Test-635699095 tempest-ServersAdmin275Test-635699095-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 613.958499] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-4e124a28-1cf4-4e90-bf63-969df95c76e0 tempest-ServersAdmin275Test-635699095 tempest-ServersAdmin275Test-635699095-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 615.140428] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:UPDATE=19,nova_cell1:SELECT=71,nova_cell1:INSERT=2,nova_cell1:SAVEPOINT=4,nova_cell1:RELEASE=4 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 615.174128] nova-conductor[52332]: ERROR nova.scheduler.utils [None req-67691a58-bf0e-4008-a0ab-ded742c0dcca tempest-ServerDiagnosticsNegativeTest-1130413716 
tempest-ServerDiagnosticsNegativeTest-1130413716-project-member] [instance: d7d47cc0-277b-4b8e-9cce-c5c9e4099205] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 537, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in 
_allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 2b563c5f-0c6b-4e3a-b609-88f7d8ad5c84, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance d7d47cc0-277b-4b8e-9cce-c5c9e4099205 was re-scheduled: Binding failed for port 2b563c5f-0c6b-4e3a-b609-88f7d8ad5c84, please check neutron logs for more information.\n'] [ 615.174712] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-67691a58-bf0e-4008-a0ab-ded742c0dcca tempest-ServerDiagnosticsNegativeTest-1130413716 
tempest-ServerDiagnosticsNegativeTest-1130413716-project-member] Rescheduling: True {{(pid=52332) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 615.174948] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-67691a58-bf0e-4008-a0ab-ded742c0dcca tempest-ServerDiagnosticsNegativeTest-1130413716 tempest-ServerDiagnosticsNegativeTest-1130413716-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance d7d47cc0-277b-4b8e-9cce-c5c9e4099205.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance d7d47cc0-277b-4b8e-9cce-c5c9e4099205. [ 615.175282] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-67691a58-bf0e-4008-a0ab-ded742c0dcca tempest-ServerDiagnosticsNegativeTest-1130413716 tempest-ServerDiagnosticsNegativeTest-1130413716-project-member] [instance: d7d47cc0-277b-4b8e-9cce-c5c9e4099205] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance d7d47cc0-277b-4b8e-9cce-c5c9e4099205. 
[ 615.203341] nova-conductor[52332]: DEBUG nova.network.neutron [None req-67691a58-bf0e-4008-a0ab-ded742c0dcca tempest-ServerDiagnosticsNegativeTest-1130413716 tempest-ServerDiagnosticsNegativeTest-1130413716-project-member] [instance: d7d47cc0-277b-4b8e-9cce-c5c9e4099205] deallocate_for_instance() {{(pid=52332) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 615.347078] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell1:UPDATE=20,nova_cell1:SELECT=73,nova_cell1:INSERT=1,nova_cell1:SAVEPOINT=3,nova_cell1:RELEASE=3 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 615.392104] nova-conductor[52332]: ERROR nova.scheduler.utils [None req-0d5a30b6-7724-4afb-9584-a69ce607b150 tempest-ImagesOneServerNegativeTestJSON-860032753 tempest-ImagesOneServerNegativeTestJSON-860032753-project-member] [instance: e8b2688d-e97d-4992-86b9-aac79fd73403] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 537, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 
635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 729a5c6d-be7a-40a4-bb4a-4d2234e6ea14, please check neutron logs for more information.\n', '\nDuring 
handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance e8b2688d-e97d-4992-86b9-aac79fd73403 was re-scheduled: Binding failed for port 729a5c6d-be7a-40a4-bb4a-4d2234e6ea14, please check neutron logs for more information.\n'] [ 615.392674] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-0d5a30b6-7724-4afb-9584-a69ce607b150 tempest-ImagesOneServerNegativeTestJSON-860032753 tempest-ImagesOneServerNegativeTestJSON-860032753-project-member] Rescheduling: True {{(pid=52332) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 615.394666] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-0d5a30b6-7724-4afb-9584-a69ce607b150 tempest-ImagesOneServerNegativeTestJSON-860032753 tempest-ImagesOneServerNegativeTestJSON-860032753-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance e8b2688d-e97d-4992-86b9-aac79fd73403.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance e8b2688d-e97d-4992-86b9-aac79fd73403. [ 615.394666] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-0d5a30b6-7724-4afb-9584-a69ce607b150 tempest-ImagesOneServerNegativeTestJSON-860032753 tempest-ImagesOneServerNegativeTestJSON-860032753-project-member] [instance: e8b2688d-e97d-4992-86b9-aac79fd73403] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. 
Exhausted all hosts available for retrying build failures for instance e8b2688d-e97d-4992-86b9-aac79fd73403. [ 615.415237] nova-conductor[52332]: DEBUG nova.network.neutron [None req-0d5a30b6-7724-4afb-9584-a69ce607b150 tempest-ImagesOneServerNegativeTestJSON-860032753 tempest-ImagesOneServerNegativeTestJSON-860032753-project-member] [instance: e8b2688d-e97d-4992-86b9-aac79fd73403] deallocate_for_instance() {{(pid=52332) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 615.577554] nova-conductor[52332]: DEBUG nova.network.neutron [None req-67691a58-bf0e-4008-a0ab-ded742c0dcca tempest-ServerDiagnosticsNegativeTest-1130413716 tempest-ServerDiagnosticsNegativeTest-1130413716-project-member] [instance: d7d47cc0-277b-4b8e-9cce-c5c9e4099205] Instance cache missing network info. {{(pid=52332) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 615.581513] nova-conductor[52332]: DEBUG nova.network.neutron [None req-67691a58-bf0e-4008-a0ab-ded742c0dcca tempest-ServerDiagnosticsNegativeTest-1130413716 tempest-ServerDiagnosticsNegativeTest-1130413716-project-member] [instance: d7d47cc0-277b-4b8e-9cce-c5c9e4099205] Updating instance_info_cache with network_info: [] {{(pid=52332) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 615.716240] nova-conductor[52332]: DEBUG nova.network.neutron [None req-0d5a30b6-7724-4afb-9584-a69ce607b150 tempest-ImagesOneServerNegativeTestJSON-860032753 tempest-ImagesOneServerNegativeTestJSON-860032753-project-member] [instance: e8b2688d-e97d-4992-86b9-aac79fd73403] Instance cache missing network info. 
{{(pid=52332) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 615.719820] nova-conductor[52332]: DEBUG nova.network.neutron [None req-0d5a30b6-7724-4afb-9584-a69ce607b150 tempest-ImagesOneServerNegativeTestJSON-860032753 tempest-ImagesOneServerNegativeTestJSON-860032753-project-member] [instance: e8b2688d-e97d-4992-86b9-aac79fd73403] Updating instance_info_cache with network_info: [] {{(pid=52332) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 616.635538] nova-conductor[52331]: ERROR nova.scheduler.utils [None req-3d9b90f0-0418-47e5-bf2e-fb2ad21d72b7 tempest-ImagesNegativeTestJSON-2094128861 tempest-ImagesNegativeTestJSON-2094128861-project-member] [instance: 5a5ada63-8ae0-40e9-96b1-d60b040ec42b] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 537, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return 
self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 3b31fbd5-84f0-4eba-afc6-596ba608ea44, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File 
"/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 5a5ada63-8ae0-40e9-96b1-d60b040ec42b was re-scheduled: Binding failed for port 3b31fbd5-84f0-4eba-afc6-596ba608ea44, please check neutron logs for more information.\n'] [ 616.635538] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-3d9b90f0-0418-47e5-bf2e-fb2ad21d72b7 tempest-ImagesNegativeTestJSON-2094128861 tempest-ImagesNegativeTestJSON-2094128861-project-member] Rescheduling: True {{(pid=52331) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 616.635538] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-3d9b90f0-0418-47e5-bf2e-fb2ad21d72b7 tempest-ImagesNegativeTestJSON-2094128861 tempest-ImagesNegativeTestJSON-2094128861-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 5a5ada63-8ae0-40e9-96b1-d60b040ec42b.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 5a5ada63-8ae0-40e9-96b1-d60b040ec42b. [ 616.635980] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-3d9b90f0-0418-47e5-bf2e-fb2ad21d72b7 tempest-ImagesNegativeTestJSON-2094128861 tempest-ImagesNegativeTestJSON-2094128861-project-member] [instance: 5a5ada63-8ae0-40e9-96b1-d60b040ec42b] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 5a5ada63-8ae0-40e9-96b1-d60b040ec42b. 
[ 616.743274] nova-conductor[52331]: DEBUG nova.network.neutron [None req-3d9b90f0-0418-47e5-bf2e-fb2ad21d72b7 tempest-ImagesNegativeTestJSON-2094128861 tempest-ImagesNegativeTestJSON-2094128861-project-member] [instance: 5a5ada63-8ae0-40e9-96b1-d60b040ec42b] deallocate_for_instance() {{(pid=52331) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 617.834495] nova-conductor[52331]: DEBUG nova.network.neutron [None req-3d9b90f0-0418-47e5-bf2e-fb2ad21d72b7 tempest-ImagesNegativeTestJSON-2094128861 tempest-ImagesNegativeTestJSON-2094128861-project-member] [instance: 5a5ada63-8ae0-40e9-96b1-d60b040ec42b] Instance cache missing network info. {{(pid=52331) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 617.839553] nova-conductor[52331]: DEBUG nova.network.neutron [None req-3d9b90f0-0418-47e5-bf2e-fb2ad21d72b7 tempest-ImagesNegativeTestJSON-2094128861 tempest-ImagesNegativeTestJSON-2094128861-project-member] [instance: 5a5ada63-8ae0-40e9-96b1-d60b040ec42b] Updating instance_info_cache with network_info: [] {{(pid=52331) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 618.416903] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-d3759443-b2ac-4ddf-bde6-c8c2d8ae86de tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Took 0.14 seconds to select destinations for 1 instance(s). 
{{(pid=52331) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}}
[ 618.435538] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d3759443-b2ac-4ddf-bde6-c8c2d8ae86de tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 618.435708] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d3759443-b2ac-4ddf-bde6-c8c2d8ae86de tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 618.435872] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d3759443-b2ac-4ddf-bde6-c8c2d8ae86de tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 618.444586] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell1:SELECT=33,nova_cell1:SAVEPOINT=3,nova_cell1:INSERT=61,nova_cell1:RELEASE=3 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 618.488151] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d3759443-b2ac-4ddf-bde6-c8c2d8ae86de tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 618.488151] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d3759443-b2ac-4ddf-bde6-c8c2d8ae86de tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 618.488358] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d3759443-b2ac-4ddf-bde6-c8c2d8ae86de tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 618.488768] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d3759443-b2ac-4ddf-bde6-c8c2d8ae86de tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 618.489023] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d3759443-b2ac-4ddf-bde6-c8c2d8ae86de tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 618.489239] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d3759443-b2ac-4ddf-bde6-c8c2d8ae86de tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 618.503778] nova-conductor[52331]: DEBUG nova.quota [None req-d3759443-b2ac-4ddf-bde6-c8c2d8ae86de tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Getting quotas for project d6fbe339521f48ef80dfe3dd94ae1046. Resources: {'instances', 'ram', 'cores'} {{(pid=52331) _get_quotas /opt/stack/nova/nova/quota.py:393}}
[ 618.507039] nova-conductor[52331]: DEBUG nova.quota [None req-d3759443-b2ac-4ddf-bde6-c8c2d8ae86de tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Getting quotas for user 0049527d28e746b5a1317b9f14e2e18a and project d6fbe339521f48ef80dfe3dd94ae1046. Resources: {'instances', 'ram', 'cores'} {{(pid=52331) _get_quotas /opt/stack/nova/nova/quota.py:383}}
[ 618.513708] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-d3759443-b2ac-4ddf-bde6-c8c2d8ae86de tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] [instance: 150e44e7-7f27-45d5-8ddf-67da74739e4d] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52331) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}}
[ 618.514304] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d3759443-b2ac-4ddf-bde6-c8c2d8ae86de tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 618.514609] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d3759443-b2ac-4ddf-bde6-c8c2d8ae86de tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 618.514989] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d3759443-b2ac-4ddf-bde6-c8c2d8ae86de tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 618.523118] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-d3759443-b2ac-4ddf-bde6-c8c2d8ae86de tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] [instance: 150e44e7-7f27-45d5-8ddf-67da74739e4d] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 618.524096] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d3759443-b2ac-4ddf-bde6-c8c2d8ae86de tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 618.524411] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d3759443-b2ac-4ddf-bde6-c8c2d8ae86de tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 618.524637] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d3759443-b2ac-4ddf-bde6-c8c2d8ae86de tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 618.546112] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d3759443-b2ac-4ddf-bde6-c8c2d8ae86de tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 618.546452] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d3759443-b2ac-4ddf-bde6-c8c2d8ae86de tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 618.546693] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d3759443-b2ac-4ddf-bde6-c8c2d8ae86de tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 620.100879] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:SELECT=64,nova_cell1:SAVEPOINT=5,nova_cell1:RELEASE=5,nova_cell1:UPDATE=21,nova_cell1:INSERT=5 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 622.042266] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_api:SELECT=92,nova_api:DELETE=3,nova_api:UPDATE=4 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 622.044331] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_api:SELECT=78,nova_api:UPDATE=3,nova_api:DELETE=4 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 622.116291] nova-conductor[52332]: ERROR nova.conductor.manager [None req-ad8931fb-d642-41d4-9034-a3f32ea1e450 tempest-ListImageFiltersTestJSON-1817970357 tempest-ListImageFiltersTestJSON-1817970357-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 622.116291] nova-conductor[52332]: Traceback (most recent call last):
[ 622.116291] nova-conductor[52332]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 622.116291] nova-conductor[52332]:     return func(*args, **kwargs)
[ 622.116291] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 622.116291] nova-conductor[52332]:     selections = self._select_destinations(
[ 622.116291] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 622.116291] nova-conductor[52332]:     selections = self._schedule(
[ 622.116291] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 622.116291] nova-conductor[52332]:     self._ensure_sufficient_hosts(
[ 622.116291] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 622.116291] nova-conductor[52332]:     raise exception.NoValidHost(reason=reason)
[ 622.116291] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 622.116291] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 622.116291] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 622.116291] nova-conductor[52332]: ERROR nova.conductor.manager     host_lists = self._schedule_instances(context, request_specs[0],
[ 622.116291] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 622.116291] nova-conductor[52332]: ERROR nova.conductor.manager     host_lists = self.query_client.select_destinations(
[ 622.116291] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 622.116291] nova-conductor[52332]: ERROR nova.conductor.manager     return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 622.116291] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 622.116291] nova-conductor[52332]: ERROR nova.conductor.manager     return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 622.116291] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 622.116291] nova-conductor[52332]: ERROR nova.conductor.manager     result = self.transport._send(
[ 622.116291] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 622.116291] nova-conductor[52332]: ERROR nova.conductor.manager     return self._driver.send(target, ctxt, message,
[ 622.116291] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 622.116291] nova-conductor[52332]: ERROR nova.conductor.manager     return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 622.116291] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 622.116291] nova-conductor[52332]: ERROR nova.conductor.manager     raise result
[ 622.116291] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 622.116291] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 622.116291] nova-conductor[52332]: ERROR nova.conductor.manager
[ 622.116291] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 622.116291] nova-conductor[52332]: ERROR nova.conductor.manager     return func(*args, **kwargs)
[ 622.116291] nova-conductor[52332]: ERROR nova.conductor.manager
[ 622.116291] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 622.116291] nova-conductor[52332]: ERROR nova.conductor.manager     selections = self._select_destinations(
[ 622.116291] nova-conductor[52332]: ERROR nova.conductor.manager
[ 622.116291] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 622.116291] nova-conductor[52332]: ERROR nova.conductor.manager     selections = self._schedule(
[ 622.116291] nova-conductor[52332]: ERROR nova.conductor.manager
[ 622.116291] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 622.116291] nova-conductor[52332]: ERROR nova.conductor.manager     self._ensure_sufficient_hosts(
[ 622.116291] nova-conductor[52332]: ERROR nova.conductor.manager
[ 622.116291] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 622.116291] nova-conductor[52332]: ERROR nova.conductor.manager     raise exception.NoValidHost(reason=reason)
[ 622.116291] nova-conductor[52332]: ERROR nova.conductor.manager
[ 622.116291] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 622.116291] nova-conductor[52332]: ERROR nova.conductor.manager
[ 622.116291] nova-conductor[52332]: ERROR nova.conductor.manager
[ 622.126060] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-ad8931fb-d642-41d4-9034-a3f32ea1e450 tempest-ListImageFiltersTestJSON-1817970357 tempest-ListImageFiltersTestJSON-1817970357-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 622.126319] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-ad8931fb-d642-41d4-9034-a3f32ea1e450 tempest-ListImageFiltersTestJSON-1817970357 tempest-ListImageFiltersTestJSON-1817970357-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 622.126454] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-ad8931fb-d642-41d4-9034-a3f32ea1e450 tempest-ListImageFiltersTestJSON-1817970357 tempest-ListImageFiltersTestJSON-1817970357-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 622.224126] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-ad8931fb-d642-41d4-9034-a3f32ea1e450 tempest-ListImageFiltersTestJSON-1817970357 tempest-ListImageFiltersTestJSON-1817970357-project-member] [instance: c1e16c47-0cd0-4775-adbd-6621ab31156e] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 622.224864] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-ad8931fb-d642-41d4-9034-a3f32ea1e450 tempest-ListImageFiltersTestJSON-1817970357 tempest-ListImageFiltersTestJSON-1817970357-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 622.225245] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-ad8931fb-d642-41d4-9034-a3f32ea1e450 tempest-ListImageFiltersTestJSON-1817970357 tempest-ListImageFiltersTestJSON-1817970357-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 622.225353] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-ad8931fb-d642-41d4-9034-a3f32ea1e450 tempest-ListImageFiltersTestJSON-1817970357 tempest-ListImageFiltersTestJSON-1817970357-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 622.235239] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-ad8931fb-d642-41d4-9034-a3f32ea1e450 tempest-ListImageFiltersTestJSON-1817970357 tempest-ListImageFiltersTestJSON-1817970357-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 622.235239] nova-conductor[52332]: Traceback (most recent call last):
[ 622.235239] nova-conductor[52332]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 622.235239] nova-conductor[52332]:     return func(*args, **kwargs)
[ 622.235239] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 622.235239] nova-conductor[52332]:     selections = self._select_destinations(
[ 622.235239] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 622.235239] nova-conductor[52332]:     selections = self._schedule(
[ 622.235239] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 622.235239] nova-conductor[52332]:     self._ensure_sufficient_hosts(
[ 622.235239] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 622.235239] nova-conductor[52332]:     raise exception.NoValidHost(reason=reason)
[ 622.235239] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 622.235239] nova-conductor[52332]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 622.235827] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-ad8931fb-d642-41d4-9034-a3f32ea1e450 tempest-ListImageFiltersTestJSON-1817970357 tempest-ListImageFiltersTestJSON-1817970357-project-member] [instance: c1e16c47-0cd0-4775-adbd-6621ab31156e] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 623.464671] nova-conductor[52331]: ERROR nova.scheduler.utils [None req-234b7c78-0994-4f70-82d0-96e8e34930b5 tempest-FloatingIPsAssociationTestJSON-345149790 tempest-FloatingIPsAssociationTestJSON-345149790-project-member] [instance: 1d44d43a-86a5-4415-bcdd-afd2790adaee] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', '  File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n    self.driver.spawn(context, instance, image_meta,\n', '  File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 537, in spawn\n    self._vmops.spawn(context, instance, image_meta, injected_files,\n', '  File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n    vm_ref = self.build_virtual_machine(instance,\n', '  File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n    vif_infos = vmwarevif.get_vif_info(self._session,\n', '  File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n    for vif in network_info:\n', '  File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n    return self._sync_wrapper(fn, *args, **kwargs)\n', '  File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n    self.wait()\n', '  File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n    self[:] = self._gt.wait()\n', '  File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n    return self._exit_event.wait()\n', '  File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n    result = hub.switch()\n', '  File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n    return self.greenlet.switch()\n', '  File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n    result = function(*args, **kwargs)\n', '  File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n    return func(*args, **kwargs)\n', '  File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n    raise e\n', '  File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n    nwinfo = self.network_api.allocate_for_instance(\n', '  File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n    created_port_ids = self._update_ports_for_instance(\n', '  File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n    with excutils.save_and_reraise_exception():\n', '  File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n    self.force_reraise()\n', '  File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n    raise self.value\n', '  File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n    updated_port = self._update_port(\n', '  File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n    _ensure_no_port_binding_failure(port)\n', '  File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n    raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 0b0c24dd-44ec-4971-9c7a-2edb12d7ad84, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', '  File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n    self._build_and_run_instance(context, instance, image,\n', '  File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n    raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 1d44d43a-86a5-4415-bcdd-afd2790adaee was re-scheduled: Binding failed for port 0b0c24dd-44ec-4971-9c7a-2edb12d7ad84, please check neutron logs for more information.\n']
[ 623.467300] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-234b7c78-0994-4f70-82d0-96e8e34930b5 tempest-FloatingIPsAssociationTestJSON-345149790 tempest-FloatingIPsAssociationTestJSON-345149790-project-member] Rescheduling: True {{(pid=52331) build_instances /opt/stack/nova/nova/conductor/manager.py:695}}
[ 623.467300] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-234b7c78-0994-4f70-82d0-96e8e34930b5 tempest-FloatingIPsAssociationTestJSON-345149790 tempest-FloatingIPsAssociationTestJSON-345149790-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 1d44d43a-86a5-4415-bcdd-afd2790adaee.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 1d44d43a-86a5-4415-bcdd-afd2790adaee.
[ 623.467300] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-234b7c78-0994-4f70-82d0-96e8e34930b5 tempest-FloatingIPsAssociationTestJSON-345149790 tempest-FloatingIPsAssociationTestJSON-345149790-project-member] [instance: 1d44d43a-86a5-4415-bcdd-afd2790adaee] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 1d44d43a-86a5-4415-bcdd-afd2790adaee.
[ 623.485158] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell1:UPDATE=21,nova_cell1:SELECT=69,nova_cell1:INSERT=2,nova_cell1:SAVEPOINT=4,nova_cell1:RELEASE=4 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 623.489953] nova-conductor[52332]: ERROR nova.conductor.manager [None req-68705052-a2a1-45ea-a3e1-6047f8124035 tempest-ServerExternalEventsTest-1801891655 tempest-ServerExternalEventsTest-1801891655-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 623.489953] nova-conductor[52332]: Traceback (most recent call last):
[ 623.489953] nova-conductor[52332]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 623.489953] nova-conductor[52332]:     return func(*args, **kwargs)
[ 623.489953] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 623.489953] nova-conductor[52332]:     selections = self._select_destinations(
[ 623.489953] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 623.489953] nova-conductor[52332]:     selections = self._schedule(
[ 623.489953] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 623.489953] nova-conductor[52332]:     self._ensure_sufficient_hosts(
[ 623.489953] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 623.489953] nova-conductor[52332]:     raise exception.NoValidHost(reason=reason)
[ 623.489953] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 623.489953] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 623.489953] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 623.489953] nova-conductor[52332]: ERROR nova.conductor.manager     host_lists = self._schedule_instances(context, request_specs[0],
[ 623.489953] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 623.489953] nova-conductor[52332]: ERROR nova.conductor.manager     host_lists = self.query_client.select_destinations(
[ 623.489953] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 623.489953] nova-conductor[52332]: ERROR nova.conductor.manager     return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 623.489953] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 623.489953] nova-conductor[52332]: ERROR nova.conductor.manager     return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 623.489953] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 623.489953] nova-conductor[52332]: ERROR nova.conductor.manager     result = self.transport._send(
[ 623.489953] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 623.489953] nova-conductor[52332]: ERROR nova.conductor.manager     return self._driver.send(target, ctxt, message,
[ 623.489953] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 623.489953] nova-conductor[52332]: ERROR nova.conductor.manager     return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 623.489953] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 623.489953] nova-conductor[52332]: ERROR nova.conductor.manager     raise result
[ 623.489953] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 623.489953] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 623.489953] nova-conductor[52332]: ERROR nova.conductor.manager
[ 623.489953] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 623.489953] nova-conductor[52332]: ERROR nova.conductor.manager     return func(*args, **kwargs)
[ 623.489953] nova-conductor[52332]: ERROR nova.conductor.manager
[ 623.489953] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 623.489953] nova-conductor[52332]: ERROR nova.conductor.manager     selections = self._select_destinations(
[ 623.489953] nova-conductor[52332]: ERROR nova.conductor.manager
[ 623.489953] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 623.489953] nova-conductor[52332]: ERROR nova.conductor.manager     selections = self._schedule(
[ 623.489953] nova-conductor[52332]: ERROR nova.conductor.manager
[ 623.489953] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 623.489953] nova-conductor[52332]: ERROR nova.conductor.manager     self._ensure_sufficient_hosts(
[ 623.489953] nova-conductor[52332]: ERROR nova.conductor.manager
[ 623.489953] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 623.489953] nova-conductor[52332]: ERROR nova.conductor.manager     raise exception.NoValidHost(reason=reason)
[ 623.489953] nova-conductor[52332]: ERROR nova.conductor.manager
[ 623.489953] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 623.489953] nova-conductor[52332]: ERROR nova.conductor.manager
[ 623.489953] nova-conductor[52332]: ERROR nova.conductor.manager
[ 623.497861] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-68705052-a2a1-45ea-a3e1-6047f8124035 tempest-ServerExternalEventsTest-1801891655 tempest-ServerExternalEventsTest-1801891655-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 623.498276] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-68705052-a2a1-45ea-a3e1-6047f8124035 tempest-ServerExternalEventsTest-1801891655 tempest-ServerExternalEventsTest-1801891655-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 623.498696] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-68705052-a2a1-45ea-a3e1-6047f8124035 tempest-ServerExternalEventsTest-1801891655 tempest-ServerExternalEventsTest-1801891655-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 623.500538] nova-conductor[52331]: DEBUG nova.network.neutron [None req-234b7c78-0994-4f70-82d0-96e8e34930b5 tempest-FloatingIPsAssociationTestJSON-345149790 tempest-FloatingIPsAssociationTestJSON-345149790-project-member] [instance: 1d44d43a-86a5-4415-bcdd-afd2790adaee] deallocate_for_instance() {{(pid=52331) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}}
[ 623.553166] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-68705052-a2a1-45ea-a3e1-6047f8124035 tempest-ServerExternalEventsTest-1801891655 tempest-ServerExternalEventsTest-1801891655-project-member] [instance: bcfc1579-f7c2-4539-8282-c48a90f6a634] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 623.553555] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-68705052-a2a1-45ea-a3e1-6047f8124035 tempest-ServerExternalEventsTest-1801891655 tempest-ServerExternalEventsTest-1801891655-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 623.553766] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-68705052-a2a1-45ea-a3e1-6047f8124035 tempest-ServerExternalEventsTest-1801891655 tempest-ServerExternalEventsTest-1801891655-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 623.553935] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-68705052-a2a1-45ea-a3e1-6047f8124035 tempest-ServerExternalEventsTest-1801891655 tempest-ServerExternalEventsTest-1801891655-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 623.555554] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell0:SELECT=46,nova_cell0:SAVEPOINT=2,nova_cell0:INSERT=43,nova_cell0:RELEASE=2,nova_cell0:UPDATE=7 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 623.562660] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-68705052-a2a1-45ea-a3e1-6047f8124035 tempest-ServerExternalEventsTest-1801891655 tempest-ServerExternalEventsTest-1801891655-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 623.562660] nova-conductor[52332]: Traceback (most recent call last): [ 623.562660] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 623.562660] nova-conductor[52332]: return func(*args, **kwargs) [ 623.562660] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 623.562660] nova-conductor[52332]: selections = self._select_destinations( [ 623.562660] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 623.562660] nova-conductor[52332]: selections = self._schedule( [ 623.562660] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 623.562660] nova-conductor[52332]: self._ensure_sufficient_hosts( [ 623.562660] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 623.562660] nova-conductor[52332]: raise exception.NoValidHost(reason=reason) [ 623.562660] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 623.562660] nova-conductor[52332]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 623.563546] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-68705052-a2a1-45ea-a3e1-6047f8124035 tempest-ServerExternalEventsTest-1801891655 tempest-ServerExternalEventsTest-1801891655-project-member] [instance: bcfc1579-f7c2-4539-8282-c48a90f6a634] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 623.585503] nova-conductor[52331]: DEBUG nova.network.neutron [None req-234b7c78-0994-4f70-82d0-96e8e34930b5 tempest-FloatingIPsAssociationTestJSON-345149790 tempest-FloatingIPsAssociationTestJSON-345149790-project-member] [instance: 1d44d43a-86a5-4415-bcdd-afd2790adaee] Instance cache missing network info. {{(pid=52331) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 623.593056] nova-conductor[52331]: DEBUG nova.network.neutron [None req-234b7c78-0994-4f70-82d0-96e8e34930b5 tempest-FloatingIPsAssociationTestJSON-345149790 tempest-FloatingIPsAssociationTestJSON-345149790-project-member] [instance: 1d44d43a-86a5-4415-bcdd-afd2790adaee] Updating instance_info_cache with network_info: [] {{(pid=52331) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 623.704350] nova-conductor[52331]: ERROR nova.conductor.manager [None req-36a1ff23-367f-4677-b301-fc141c3d7f97 tempest-ListImageFiltersTestJSON-1817970357 tempest-ListImageFiltersTestJSON-1817970357-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 623.704350] nova-conductor[52331]: Traceback (most recent call last): [ 623.704350] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 623.704350] nova-conductor[52331]: return func(*args, **kwargs) [ 623.704350] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 623.704350] nova-conductor[52331]: selections = self._select_destinations( [ 623.704350] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 623.704350] nova-conductor[52331]: selections = self._schedule( [ 623.704350] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 623.704350] nova-conductor[52331]: self._ensure_sufficient_hosts( [ 623.704350] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 623.704350] nova-conductor[52331]: raise exception.NoValidHost(reason=reason) [ 623.704350] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 623.704350] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last): [ 623.704350] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 623.704350] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 623.704350] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 623.704350] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 623.704350] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 623.704350] nova-conductor[52331]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 623.704350] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 623.704350] nova-conductor[52331]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 623.704350] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 623.704350] nova-conductor[52331]: ERROR nova.conductor.manager result = self.transport._send( [ 623.704350] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 623.704350] nova-conductor[52331]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 623.704350] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 623.704350] nova-conductor[52331]: ERROR nova.conductor.manager return 
self._send(target, ctxt, message, wait_for_reply, timeout, [ 623.704350] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 623.704350] nova-conductor[52331]: ERROR nova.conductor.manager raise result [ 623.704350] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 623.704350] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last): [ 623.704350] nova-conductor[52331]: ERROR nova.conductor.manager [ 623.704350] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 623.704350] nova-conductor[52331]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 623.704350] nova-conductor[52331]: ERROR nova.conductor.manager [ 623.704350] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 623.704350] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._select_destinations( [ 623.704350] nova-conductor[52331]: ERROR nova.conductor.manager [ 623.704350] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 623.704350] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._schedule( [ 623.704350] nova-conductor[52331]: ERROR nova.conductor.manager [ 623.704350] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 623.704350] nova-conductor[52331]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 623.704350] nova-conductor[52331]: ERROR nova.conductor.manager [ 623.704350] nova-conductor[52331]: ERROR nova.conductor.manager File 
"/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 623.704350] nova-conductor[52331]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 623.704350] nova-conductor[52331]: ERROR nova.conductor.manager [ 623.704350] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 623.704350] nova-conductor[52331]: ERROR nova.conductor.manager [ 623.704350] nova-conductor[52331]: ERROR nova.conductor.manager [ 623.714932] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-36a1ff23-367f-4677-b301-fc141c3d7f97 tempest-ListImageFiltersTestJSON-1817970357 tempest-ListImageFiltersTestJSON-1817970357-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 623.715156] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-36a1ff23-367f-4677-b301-fc141c3d7f97 tempest-ListImageFiltersTestJSON-1817970357 tempest-ListImageFiltersTestJSON-1817970357-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 623.715355] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-36a1ff23-367f-4677-b301-fc141c3d7f97 tempest-ListImageFiltersTestJSON-1817970357 tempest-ListImageFiltersTestJSON-1817970357-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 623.787355] nova-conductor[52331]: DEBUG nova.conductor.manager [None 
req-36a1ff23-367f-4677-b301-fc141c3d7f97 tempest-ListImageFiltersTestJSON-1817970357 tempest-ListImageFiltersTestJSON-1817970357-project-member] [instance: 4cf43167-b4ad-44ed-8203-7da85d7b2a3d] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 623.788435] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-36a1ff23-367f-4677-b301-fc141c3d7f97 tempest-ListImageFiltersTestJSON-1817970357 tempest-ListImageFiltersTestJSON-1817970357-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 623.789033] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-36a1ff23-367f-4677-b301-fc141c3d7f97 tempest-ListImageFiltersTestJSON-1817970357 tempest-ListImageFiltersTestJSON-1817970357-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 623.789460] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-36a1ff23-367f-4677-b301-fc141c3d7f97 tempest-ListImageFiltersTestJSON-1817970357 tempest-ListImageFiltersTestJSON-1817970357-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 623.799714] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-36a1ff23-367f-4677-b301-fc141c3d7f97 tempest-ListImageFiltersTestJSON-1817970357 tempest-ListImageFiltersTestJSON-1817970357-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 623.799714] nova-conductor[52331]: Traceback (most recent call last): [ 623.799714] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 623.799714] nova-conductor[52331]: return func(*args, **kwargs) [ 623.799714] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 623.799714] nova-conductor[52331]: selections = self._select_destinations( [ 623.799714] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 623.799714] nova-conductor[52331]: selections = self._schedule( [ 623.799714] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 623.799714] nova-conductor[52331]: self._ensure_sufficient_hosts( [ 623.799714] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 623.799714] nova-conductor[52331]: raise exception.NoValidHost(reason=reason) [ 623.799714] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 623.799714] nova-conductor[52331]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 623.802052] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-36a1ff23-367f-4677-b301-fc141c3d7f97 tempest-ListImageFiltersTestJSON-1817970357 tempest-ListImageFiltersTestJSON-1817970357-project-member] [instance: 4cf43167-b4ad-44ed-8203-7da85d7b2a3d] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 623.950269] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:INSERT=18,nova_cell1:SELECT=6 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 628.355051] nova-conductor[52331]: ERROR nova.conductor.manager [None req-225ece33-f4af-4bc3-a34a-31dfbbd63d3e tempest-ServersTestManualDisk-1092728717 tempest-ServersTestManualDisk-1092728717-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 628.355051] nova-conductor[52331]: Traceback (most recent call last): [ 628.355051] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 628.355051] nova-conductor[52331]: return func(*args, **kwargs) [ 628.355051] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 628.355051] nova-conductor[52331]: selections = self._select_destinations( [ 628.355051] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 628.355051] nova-conductor[52331]: selections = self._schedule( [ 628.355051] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 628.355051] nova-conductor[52331]: self._ensure_sufficient_hosts( [ 628.355051] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 628.355051] nova-conductor[52331]: raise exception.NoValidHost(reason=reason) [ 
628.355051] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 628.355051] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last): [ 628.355051] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 628.355051] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 628.355051] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 628.355051] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 628.355051] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 628.355051] nova-conductor[52331]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 628.355051] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 628.355051] nova-conductor[52331]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 628.355051] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 628.355051] nova-conductor[52331]: ERROR nova.conductor.manager result = self.transport._send( [ 628.355051] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 628.355051] nova-conductor[52331]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 628.355051] nova-conductor[52331]: ERROR nova.conductor.manager File 
"/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 628.355051] nova-conductor[52331]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 628.355051] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 628.355051] nova-conductor[52331]: ERROR nova.conductor.manager raise result [ 628.355051] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 628.355051] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last): [ 628.355051] nova-conductor[52331]: ERROR nova.conductor.manager [ 628.355051] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 628.355051] nova-conductor[52331]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 628.355051] nova-conductor[52331]: ERROR nova.conductor.manager [ 628.355051] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 628.355051] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._select_destinations( [ 628.355051] nova-conductor[52331]: ERROR nova.conductor.manager [ 628.355051] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 628.355051] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._schedule( [ 628.355051] nova-conductor[52331]: ERROR nova.conductor.manager [ 628.355051] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 628.355051] nova-conductor[52331]: ERROR nova.conductor.manager 
self._ensure_sufficient_hosts( [ 628.355051] nova-conductor[52331]: ERROR nova.conductor.manager [ 628.355051] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 628.355051] nova-conductor[52331]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 628.355051] nova-conductor[52331]: ERROR nova.conductor.manager [ 628.355051] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 628.355051] nova-conductor[52331]: ERROR nova.conductor.manager [ 628.355051] nova-conductor[52331]: ERROR nova.conductor.manager [ 628.368346] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-225ece33-f4af-4bc3-a34a-31dfbbd63d3e tempest-ServersTestManualDisk-1092728717 tempest-ServersTestManualDisk-1092728717-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 628.369830] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-225ece33-f4af-4bc3-a34a-31dfbbd63d3e tempest-ServersTestManualDisk-1092728717 tempest-ServersTestManualDisk-1092728717-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.002s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 628.370156] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-225ece33-f4af-4bc3-a34a-31dfbbd63d3e tempest-ServersTestManualDisk-1092728717 tempest-ServersTestManualDisk-1092728717-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 628.469337] nova-conductor[52332]: ERROR nova.scheduler.utils [None req-81781ab6-33ba-48ed-9284-0f5250082793 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] [instance: 3c98bffb-a7be-47bc-9ce2-610b6ce3e05c] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 537, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in 
context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port f6d1391e-35bb-4cc0-958a-aea522a7bfbd, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 3c98bffb-a7be-47bc-9ce2-610b6ce3e05c was re-scheduled: Binding failed for port f6d1391e-35bb-4cc0-958a-aea522a7bfbd, please check neutron logs for more 
information.\n'] [ 628.472062] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-81781ab6-33ba-48ed-9284-0f5250082793 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Rescheduling: True {{(pid=52332) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 628.472459] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-81781ab6-33ba-48ed-9284-0f5250082793 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 3c98bffb-a7be-47bc-9ce2-610b6ce3e05c.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 3c98bffb-a7be-47bc-9ce2-610b6ce3e05c. [ 628.473652] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-81781ab6-33ba-48ed-9284-0f5250082793 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] [instance: 3c98bffb-a7be-47bc-9ce2-610b6ce3e05c] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 3c98bffb-a7be-47bc-9ce2-610b6ce3e05c. 
[ 628.491569] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-225ece33-f4af-4bc3-a34a-31dfbbd63d3e tempest-ServersTestManualDisk-1092728717 tempest-ServersTestManualDisk-1092728717-project-member] [instance: 45052551-847c-4f58-9429-351b8eba2536] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 628.491569] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-225ece33-f4af-4bc3-a34a-31dfbbd63d3e tempest-ServersTestManualDisk-1092728717 tempest-ServersTestManualDisk-1092728717-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 628.491569] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-225ece33-f4af-4bc3-a34a-31dfbbd63d3e tempest-ServersTestManualDisk-1092728717 tempest-ServersTestManualDisk-1092728717-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 628.492062] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-225ece33-f4af-4bc3-a34a-31dfbbd63d3e tempest-ServersTestManualDisk-1092728717 tempest-ServersTestManualDisk-1092728717-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 628.499578] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-225ece33-f4af-4bc3-a34a-31dfbbd63d3e tempest-ServersTestManualDisk-1092728717 tempest-ServersTestManualDisk-1092728717-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 628.499578] nova-conductor[52331]: Traceback (most recent call last):
[ 628.499578] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 628.499578] nova-conductor[52331]: return func(*args, **kwargs)
[ 628.499578] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 628.499578] nova-conductor[52331]: selections = self._select_destinations(
[ 628.499578] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 628.499578] nova-conductor[52331]: selections = self._schedule(
[ 628.499578] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 628.499578] nova-conductor[52331]: self._ensure_sufficient_hosts(
[ 628.499578] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 628.499578] nova-conductor[52331]: raise exception.NoValidHost(reason=reason)
[ 628.499578] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 628.499578] nova-conductor[52331]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 628.500055] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-225ece33-f4af-4bc3-a34a-31dfbbd63d3e tempest-ServersTestManualDisk-1092728717 tempest-ServersTestManualDisk-1092728717-project-member] [instance: 45052551-847c-4f58-9429-351b8eba2536] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 628.511554] nova-conductor[52332]: DEBUG nova.network.neutron [None req-81781ab6-33ba-48ed-9284-0f5250082793 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] [instance: 3c98bffb-a7be-47bc-9ce2-610b6ce3e05c] deallocate_for_instance() {{(pid=52332) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}}
[ 628.529853] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell1:INSERT=18,nova_cell1:SELECT=6 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 628.710200] nova-conductor[52332]: DEBUG nova.network.neutron [None req-81781ab6-33ba-48ed-9284-0f5250082793 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] [instance: 3c98bffb-a7be-47bc-9ce2-610b6ce3e05c] Instance cache missing network info. {{(pid=52332) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}}
[ 628.714507] nova-conductor[52332]: DEBUG nova.network.neutron [None req-81781ab6-33ba-48ed-9284-0f5250082793 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] [instance: 3c98bffb-a7be-47bc-9ce2-610b6ce3e05c] Updating instance_info_cache with network_info: [] {{(pid=52332) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 630.956327] nova-conductor[52331]: ERROR nova.conductor.manager [None req-23230d66-1bf8-448f-b87f-3370694ae931 tempest-ServersWithSpecificFlavorTestJSON-1259060906 tempest-ServersWithSpecificFlavorTestJSON-1259060906-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 630.956327] nova-conductor[52331]: Traceback (most recent call last):
[ 630.956327] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 630.956327] nova-conductor[52331]: return func(*args, **kwargs)
[ 630.956327] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 630.956327] nova-conductor[52331]: selections = self._select_destinations(
[ 630.956327] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 630.956327] nova-conductor[52331]: selections = self._schedule(
[ 630.956327] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 630.956327] nova-conductor[52331]: self._ensure_sufficient_hosts(
[ 630.956327] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 630.956327] nova-conductor[52331]: raise exception.NoValidHost(reason=reason)
[ 630.956327] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 630.956327] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 630.956327] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 630.956327] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0],
[ 630.956327] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 630.956327] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations(
[ 630.956327] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 630.956327] nova-conductor[52331]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 630.956327] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 630.956327] nova-conductor[52331]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 630.956327] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 630.956327] nova-conductor[52331]: ERROR nova.conductor.manager result = self.transport._send(
[ 630.956327] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 630.956327] nova-conductor[52331]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message,
[ 630.956327] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 630.956327] nova-conductor[52331]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 630.956327] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 630.956327] nova-conductor[52331]: ERROR nova.conductor.manager raise result
[ 630.956327] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 630.956327] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 630.956327] nova-conductor[52331]: ERROR nova.conductor.manager
[ 630.956327] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 630.956327] nova-conductor[52331]: ERROR nova.conductor.manager return func(*args, **kwargs)
[ 630.956327] nova-conductor[52331]: ERROR nova.conductor.manager
[ 630.956327] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 630.956327] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._select_destinations(
[ 630.956327] nova-conductor[52331]: ERROR nova.conductor.manager
[ 630.956327] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 630.956327] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._schedule(
[ 630.956327] nova-conductor[52331]: ERROR nova.conductor.manager
[ 630.956327] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 630.956327] nova-conductor[52331]: ERROR nova.conductor.manager self._ensure_sufficient_hosts(
[ 630.956327] nova-conductor[52331]: ERROR nova.conductor.manager
[ 630.956327] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 630.956327] nova-conductor[52331]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason)
[ 630.956327] nova-conductor[52331]: ERROR nova.conductor.manager
[ 630.956327] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 630.956327] nova-conductor[52331]: ERROR nova.conductor.manager
[ 630.956327] nova-conductor[52331]: ERROR nova.conductor.manager
[ 630.965700] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-23230d66-1bf8-448f-b87f-3370694ae931 tempest-ServersWithSpecificFlavorTestJSON-1259060906 tempest-ServersWithSpecificFlavorTestJSON-1259060906-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 630.965700] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-23230d66-1bf8-448f-b87f-3370694ae931 tempest-ServersWithSpecificFlavorTestJSON-1259060906 tempest-ServersWithSpecificFlavorTestJSON-1259060906-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 630.965872] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-23230d66-1bf8-448f-b87f-3370694ae931 tempest-ServersWithSpecificFlavorTestJSON-1259060906 tempest-ServersWithSpecificFlavorTestJSON-1259060906-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 630.968013] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell0:SELECT=46,nova_cell0:INSERT=44,nova_cell0:UPDATE=8,nova_cell0:SAVEPOINT=1,nova_cell0:RELEASE=1 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 631.014510] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-23230d66-1bf8-448f-b87f-3370694ae931 tempest-ServersWithSpecificFlavorTestJSON-1259060906 tempest-ServersWithSpecificFlavorTestJSON-1259060906-project-member] [instance: bd62e8f7-4652-46af-a312-09393fb5f5de] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 631.015455] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-23230d66-1bf8-448f-b87f-3370694ae931 tempest-ServersWithSpecificFlavorTestJSON-1259060906 tempest-ServersWithSpecificFlavorTestJSON-1259060906-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 631.015816] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-23230d66-1bf8-448f-b87f-3370694ae931 tempest-ServersWithSpecificFlavorTestJSON-1259060906 tempest-ServersWithSpecificFlavorTestJSON-1259060906-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 631.016158] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-23230d66-1bf8-448f-b87f-3370694ae931 tempest-ServersWithSpecificFlavorTestJSON-1259060906 tempest-ServersWithSpecificFlavorTestJSON-1259060906-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 631.020080] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-23230d66-1bf8-448f-b87f-3370694ae931 tempest-ServersWithSpecificFlavorTestJSON-1259060906 tempest-ServersWithSpecificFlavorTestJSON-1259060906-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 631.020080] nova-conductor[52331]: Traceback (most recent call last):
[ 631.020080] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 631.020080] nova-conductor[52331]: return func(*args, **kwargs)
[ 631.020080] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 631.020080] nova-conductor[52331]: selections = self._select_destinations(
[ 631.020080] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 631.020080] nova-conductor[52331]: selections = self._schedule(
[ 631.020080] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 631.020080] nova-conductor[52331]: self._ensure_sufficient_hosts(
[ 631.020080] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 631.020080] nova-conductor[52331]: raise exception.NoValidHost(reason=reason)
[ 631.020080] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 631.020080] nova-conductor[52331]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 631.021343] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-23230d66-1bf8-448f-b87f-3370694ae931 tempest-ServersWithSpecificFlavorTestJSON-1259060906 tempest-ServersWithSpecificFlavorTestJSON-1259060906-project-member] [instance: bd62e8f7-4652-46af-a312-09393fb5f5de] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 631.540733] nova-conductor[52332]: ERROR nova.conductor.manager [None req-52ade7c5-8e22-4097-a0b7-e0d4e1962118 tempest-ListServersNegativeTestJSON-151456438 tempest-ListServersNegativeTestJSON-151456438-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 631.540733] nova-conductor[52332]: Traceback (most recent call last):
[ 631.540733] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 631.540733] nova-conductor[52332]: return func(*args, **kwargs)
[ 631.540733] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 631.540733] nova-conductor[52332]: selections = self._select_destinations(
[ 631.540733] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 631.540733] nova-conductor[52332]: selections = self._schedule(
[ 631.540733] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 631.540733] nova-conductor[52332]: self._ensure_sufficient_hosts(
[ 631.540733] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 631.540733] nova-conductor[52332]: raise exception.NoValidHost(reason=reason)
[ 631.540733] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 631.540733] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 631.540733] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 631.540733] nova-conductor[52332]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0],
[ 631.540733] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 631.540733] nova-conductor[52332]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations(
[ 631.540733] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 631.540733] nova-conductor[52332]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 631.540733] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 631.540733] nova-conductor[52332]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 631.540733] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 631.540733] nova-conductor[52332]: ERROR nova.conductor.manager result = self.transport._send(
[ 631.540733] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 631.540733] nova-conductor[52332]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message,
[ 631.540733] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 631.540733] nova-conductor[52332]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 631.540733] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 631.540733] nova-conductor[52332]: ERROR nova.conductor.manager raise result
[ 631.540733] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 631.540733] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 631.540733] nova-conductor[52332]: ERROR nova.conductor.manager
[ 631.540733] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 631.540733] nova-conductor[52332]: ERROR nova.conductor.manager return func(*args, **kwargs)
[ 631.540733] nova-conductor[52332]: ERROR nova.conductor.manager
[ 631.540733] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 631.540733] nova-conductor[52332]: ERROR nova.conductor.manager selections = self._select_destinations(
[ 631.540733] nova-conductor[52332]: ERROR nova.conductor.manager
[ 631.540733] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 631.540733] nova-conductor[52332]: ERROR nova.conductor.manager selections = self._schedule(
[ 631.540733] nova-conductor[52332]: ERROR nova.conductor.manager
[ 631.540733] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 631.540733] nova-conductor[52332]: ERROR nova.conductor.manager self._ensure_sufficient_hosts(
[ 631.540733] nova-conductor[52332]: ERROR nova.conductor.manager
[ 631.540733] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 631.540733] nova-conductor[52332]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason)
[ 631.540733] nova-conductor[52332]: ERROR nova.conductor.manager
[ 631.540733] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 631.540733] nova-conductor[52332]: ERROR nova.conductor.manager
[ 631.540733] nova-conductor[52332]: ERROR nova.conductor.manager
[ 631.547958] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-52ade7c5-8e22-4097-a0b7-e0d4e1962118 tempest-ListServersNegativeTestJSON-151456438 tempest-ListServersNegativeTestJSON-151456438-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 631.552225] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-52ade7c5-8e22-4097-a0b7-e0d4e1962118 tempest-ListServersNegativeTestJSON-151456438 tempest-ListServersNegativeTestJSON-151456438-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.002s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 631.552225] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-52ade7c5-8e22-4097-a0b7-e0d4e1962118 tempest-ListServersNegativeTestJSON-151456438 tempest-ListServersNegativeTestJSON-151456438-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 631.593381] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-52ade7c5-8e22-4097-a0b7-e0d4e1962118 tempest-ListServersNegativeTestJSON-151456438 tempest-ListServersNegativeTestJSON-151456438-project-member] [instance: fdb8af7a-872e-4299-8ddd-ef09d67f3e79] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 631.594216] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-52ade7c5-8e22-4097-a0b7-e0d4e1962118 tempest-ListServersNegativeTestJSON-151456438 tempest-ListServersNegativeTestJSON-151456438-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 631.594467] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-52ade7c5-8e22-4097-a0b7-e0d4e1962118 tempest-ListServersNegativeTestJSON-151456438 tempest-ListServersNegativeTestJSON-151456438-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 631.594672] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-52ade7c5-8e22-4097-a0b7-e0d4e1962118 tempest-ListServersNegativeTestJSON-151456438 tempest-ListServersNegativeTestJSON-151456438-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 631.597916] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-52ade7c5-8e22-4097-a0b7-e0d4e1962118 tempest-ListServersNegativeTestJSON-151456438 tempest-ListServersNegativeTestJSON-151456438-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 631.597916] nova-conductor[52332]: Traceback (most recent call last):
[ 631.597916] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 631.597916] nova-conductor[52332]: return func(*args, **kwargs)
[ 631.597916] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 631.597916] nova-conductor[52332]: selections = self._select_destinations(
[ 631.597916] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 631.597916] nova-conductor[52332]: selections = self._schedule(
[ 631.597916] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 631.597916] nova-conductor[52332]: self._ensure_sufficient_hosts(
[ 631.597916] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 631.597916] nova-conductor[52332]: raise exception.NoValidHost(reason=reason)
[ 631.597916] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 631.597916] nova-conductor[52332]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 631.598730] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-52ade7c5-8e22-4097-a0b7-e0d4e1962118 tempest-ListServersNegativeTestJSON-151456438 tempest-ListServersNegativeTestJSON-151456438-project-member] [instance: fdb8af7a-872e-4299-8ddd-ef09d67f3e79] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 631.626133] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-52ade7c5-8e22-4097-a0b7-e0d4e1962118 tempest-ListServersNegativeTestJSON-151456438 tempest-ListServersNegativeTestJSON-151456438-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 631.626418] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-52ade7c5-8e22-4097-a0b7-e0d4e1962118 tempest-ListServersNegativeTestJSON-151456438 tempest-ListServersNegativeTestJSON-151456438-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 631.626732] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-52ade7c5-8e22-4097-a0b7-e0d4e1962118 tempest-ListServersNegativeTestJSON-151456438 tempest-ListServersNegativeTestJSON-151456438-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 631.667684] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-52ade7c5-8e22-4097-a0b7-e0d4e1962118 tempest-ListServersNegativeTestJSON-151456438 tempest-ListServersNegativeTestJSON-151456438-project-member] [instance: 0eddaf34-864c-474c-8d9a-5c068f5e02ca] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 631.668283] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-52ade7c5-8e22-4097-a0b7-e0d4e1962118 tempest-ListServersNegativeTestJSON-151456438 tempest-ListServersNegativeTestJSON-151456438-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 631.668550] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-52ade7c5-8e22-4097-a0b7-e0d4e1962118 tempest-ListServersNegativeTestJSON-151456438 tempest-ListServersNegativeTestJSON-151456438-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 631.668755] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-52ade7c5-8e22-4097-a0b7-e0d4e1962118 tempest-ListServersNegativeTestJSON-151456438 tempest-ListServersNegativeTestJSON-151456438-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 631.673733] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-52ade7c5-8e22-4097-a0b7-e0d4e1962118 tempest-ListServersNegativeTestJSON-151456438 tempest-ListServersNegativeTestJSON-151456438-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 631.673733] nova-conductor[52332]: Traceback (most recent call last):
[ 631.673733] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 631.673733] nova-conductor[52332]: return func(*args, **kwargs)
[ 631.673733] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 631.673733] nova-conductor[52332]: selections = self._select_destinations(
[ 631.673733] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 631.673733] nova-conductor[52332]: selections = self._schedule(
[ 631.673733] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 631.673733] nova-conductor[52332]: self._ensure_sufficient_hosts(
[ 631.673733] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 631.673733] nova-conductor[52332]: raise exception.NoValidHost(reason=reason)
[ 631.673733] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 631.673733] nova-conductor[52332]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 631.673733] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-52ade7c5-8e22-4097-a0b7-e0d4e1962118 tempest-ListServersNegativeTestJSON-151456438 tempest-ListServersNegativeTestJSON-151456438-project-member] [instance: 0eddaf34-864c-474c-8d9a-5c068f5e02ca] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 631.701671] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-52ade7c5-8e22-4097-a0b7-e0d4e1962118 tempest-ListServersNegativeTestJSON-151456438 tempest-ListServersNegativeTestJSON-151456438-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 631.702045] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-52ade7c5-8e22-4097-a0b7-e0d4e1962118 tempest-ListServersNegativeTestJSON-151456438 tempest-ListServersNegativeTestJSON-151456438-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 631.702233] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-52ade7c5-8e22-4097-a0b7-e0d4e1962118 tempest-ListServersNegativeTestJSON-151456438 tempest-ListServersNegativeTestJSON-151456438-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 631.717149] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell0:SELECT=42,nova_cell0:UPDATE=9,nova_cell0:INSERT=47,nova_cell0:SAVEPOINT=1,nova_cell0:RELEASE=1 
{{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 631.745113] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-52ade7c5-8e22-4097-a0b7-e0d4e1962118 tempest-ListServersNegativeTestJSON-151456438 tempest-ListServersNegativeTestJSON-151456438-project-member] [instance: 681af6a7-5b12-4cbd-8524-b7e153d967f8] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 631.745840] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-52ade7c5-8e22-4097-a0b7-e0d4e1962118 tempest-ListServersNegativeTestJSON-151456438 tempest-ListServersNegativeTestJSON-151456438-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 631.746057] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-52ade7c5-8e22-4097-a0b7-e0d4e1962118 tempest-ListServersNegativeTestJSON-151456438 tempest-ListServersNegativeTestJSON-151456438-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 631.746229] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None 
req-52ade7c5-8e22-4097-a0b7-e0d4e1962118 tempest-ListServersNegativeTestJSON-151456438 tempest-ListServersNegativeTestJSON-151456438-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 631.749760] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-52ade7c5-8e22-4097-a0b7-e0d4e1962118 tempest-ListServersNegativeTestJSON-151456438 tempest-ListServersNegativeTestJSON-151456438-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 631.749760] nova-conductor[52332]: Traceback (most recent call last): [ 631.749760] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 631.749760] nova-conductor[52332]: return func(*args, **kwargs) [ 631.749760] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 631.749760] nova-conductor[52332]: selections = self._select_destinations( [ 631.749760] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 631.749760] nova-conductor[52332]: selections = self._schedule( [ 631.749760] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 631.749760] nova-conductor[52332]: self._ensure_sufficient_hosts( [ 631.749760] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 631.749760] nova-conductor[52332]: raise exception.NoValidHost(reason=reason) [ 631.749760] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 631.749760] nova-conductor[52332]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. 
There are not enough hosts available. [ 631.750293] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-52ade7c5-8e22-4097-a0b7-e0d4e1962118 tempest-ListServersNegativeTestJSON-151456438 tempest-ListServersNegativeTestJSON-151456438-project-member] [instance: 681af6a7-5b12-4cbd-8524-b7e153d967f8] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 632.983260] nova-conductor[52332]: ERROR nova.conductor.manager [None req-2482167b-a03e-47d7-8744-a1c01a073d75 tempest-VolumesAdminNegativeTest-977207138 tempest-VolumesAdminNegativeTest-977207138-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 632.983260] nova-conductor[52332]: Traceback (most recent call last): [ 632.983260] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 632.983260] nova-conductor[52332]: return func(*args, **kwargs) [ 632.983260] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 632.983260] nova-conductor[52332]: selections = self._select_destinations( [ 632.983260] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 632.983260] nova-conductor[52332]: selections = self._schedule( [ 632.983260] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 632.983260] nova-conductor[52332]: self._ensure_sufficient_hosts( [ 632.983260] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 632.983260] nova-conductor[52332]: raise exception.NoValidHost(reason=reason) [ 632.983260] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 632.983260] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last): [ 632.983260] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 632.983260] nova-conductor[52332]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 632.983260] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 632.983260] nova-conductor[52332]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 632.983260] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 632.983260] nova-conductor[52332]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 632.983260] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 632.983260] nova-conductor[52332]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 632.983260] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 632.983260] nova-conductor[52332]: ERROR nova.conductor.manager result = self.transport._send( [ 632.983260] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 632.983260] nova-conductor[52332]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 632.983260] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 632.983260] nova-conductor[52332]: ERROR nova.conductor.manager return 
self._send(target, ctxt, message, wait_for_reply, timeout, [ 632.983260] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 632.983260] nova-conductor[52332]: ERROR nova.conductor.manager raise result [ 632.983260] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 632.983260] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last): [ 632.983260] nova-conductor[52332]: ERROR nova.conductor.manager [ 632.983260] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 632.983260] nova-conductor[52332]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 632.983260] nova-conductor[52332]: ERROR nova.conductor.manager [ 632.983260] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 632.983260] nova-conductor[52332]: ERROR nova.conductor.manager selections = self._select_destinations( [ 632.983260] nova-conductor[52332]: ERROR nova.conductor.manager [ 632.983260] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 632.983260] nova-conductor[52332]: ERROR nova.conductor.manager selections = self._schedule( [ 632.983260] nova-conductor[52332]: ERROR nova.conductor.manager [ 632.983260] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 632.983260] nova-conductor[52332]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 632.983260] nova-conductor[52332]: ERROR nova.conductor.manager [ 632.983260] nova-conductor[52332]: ERROR nova.conductor.manager File 
"/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 632.983260] nova-conductor[52332]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 632.983260] nova-conductor[52332]: ERROR nova.conductor.manager [ 632.983260] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 632.983260] nova-conductor[52332]: ERROR nova.conductor.manager [ 632.983260] nova-conductor[52332]: ERROR nova.conductor.manager [ 632.990486] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-2482167b-a03e-47d7-8744-a1c01a073d75 tempest-VolumesAdminNegativeTest-977207138 tempest-VolumesAdminNegativeTest-977207138-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 632.990791] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-2482167b-a03e-47d7-8744-a1c01a073d75 tempest-VolumesAdminNegativeTest-977207138 tempest-VolumesAdminNegativeTest-977207138-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 632.991023] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-2482167b-a03e-47d7-8744-a1c01a073d75 tempest-VolumesAdminNegativeTest-977207138 tempest-VolumesAdminNegativeTest-977207138-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 633.049779] nova-conductor[52332]: DEBUG nova.conductor.manager [None 
req-2482167b-a03e-47d7-8744-a1c01a073d75 tempest-VolumesAdminNegativeTest-977207138 tempest-VolumesAdminNegativeTest-977207138-project-member] [instance: bf82ede3-cdbc-4805-afed-0f465a2ec1a6] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 633.050514] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-2482167b-a03e-47d7-8744-a1c01a073d75 tempest-VolumesAdminNegativeTest-977207138 tempest-VolumesAdminNegativeTest-977207138-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 633.050733] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-2482167b-a03e-47d7-8744-a1c01a073d75 tempest-VolumesAdminNegativeTest-977207138 tempest-VolumesAdminNegativeTest-977207138-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 633.050912] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-2482167b-a03e-47d7-8744-a1c01a073d75 tempest-VolumesAdminNegativeTest-977207138 tempest-VolumesAdminNegativeTest-977207138-project-member] Lock "00000000-0000-0000-0000-000000000000" 
"released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 633.054279] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-2482167b-a03e-47d7-8744-a1c01a073d75 tempest-VolumesAdminNegativeTest-977207138 tempest-VolumesAdminNegativeTest-977207138-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 633.054279] nova-conductor[52332]: Traceback (most recent call last): [ 633.054279] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 633.054279] nova-conductor[52332]: return func(*args, **kwargs) [ 633.054279] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 633.054279] nova-conductor[52332]: selections = self._select_destinations( [ 633.054279] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 633.054279] nova-conductor[52332]: selections = self._schedule( [ 633.054279] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 633.054279] nova-conductor[52332]: self._ensure_sufficient_hosts( [ 633.054279] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 633.054279] nova-conductor[52332]: raise exception.NoValidHost(reason=reason) [ 633.054279] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 633.054279] nova-conductor[52332]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 633.054830] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-2482167b-a03e-47d7-8744-a1c01a073d75 tempest-VolumesAdminNegativeTest-977207138 tempest-VolumesAdminNegativeTest-977207138-project-member] [instance: bf82ede3-cdbc-4805-afed-0f465a2ec1a6] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 633.323605] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:SELECT=67,nova_cell1:SAVEPOINT=4,nova_cell1:RELEASE=4,nova_cell1:UPDATE=22,nova_cell1:INSERT=3 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 633.404271] nova-conductor[52332]: ERROR nova.scheduler.utils [None req-d3759443-b2ac-4ddf-bde6-c8c2d8ae86de tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] [instance: 150e44e7-7f27-45d5-8ddf-67da74739e4d] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 537, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File 
"/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 802b5927-2f55-4ccb-ab31-bf79a83c9442, please check neutron 
logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 150e44e7-7f27-45d5-8ddf-67da74739e4d was re-scheduled: Binding failed for port 802b5927-2f55-4ccb-ab31-bf79a83c9442, please check neutron logs for more information.\n'] [ 633.404829] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-d3759443-b2ac-4ddf-bde6-c8c2d8ae86de tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Rescheduling: True {{(pid=52332) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 633.405573] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-d3759443-b2ac-4ddf-bde6-c8c2d8ae86de tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 150e44e7-7f27-45d5-8ddf-67da74739e4d.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 150e44e7-7f27-45d5-8ddf-67da74739e4d. [ 633.405835] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-d3759443-b2ac-4ddf-bde6-c8c2d8ae86de tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] [instance: 150e44e7-7f27-45d5-8ddf-67da74739e4d] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. 
Exhausted all hosts available for retrying build failures for instance 150e44e7-7f27-45d5-8ddf-67da74739e4d. [ 633.432110] nova-conductor[52332]: DEBUG nova.network.neutron [None req-d3759443-b2ac-4ddf-bde6-c8c2d8ae86de tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] [instance: 150e44e7-7f27-45d5-8ddf-67da74739e4d] deallocate_for_instance() {{(pid=52332) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 633.557913] nova-conductor[52332]: DEBUG nova.network.neutron [None req-d3759443-b2ac-4ddf-bde6-c8c2d8ae86de tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] [instance: 150e44e7-7f27-45d5-8ddf-67da74739e4d] Instance cache missing network info. {{(pid=52332) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 633.562106] nova-conductor[52332]: DEBUG nova.network.neutron [None req-d3759443-b2ac-4ddf-bde6-c8c2d8ae86de tempest-DeleteServersAdminTestJSON-1696381296 tempest-DeleteServersAdminTestJSON-1696381296-project-member] [instance: 150e44e7-7f27-45d5-8ddf-67da74739e4d] Updating instance_info_cache with network_info: [] {{(pid=52332) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 635.128124] nova-conductor[52332]: ERROR nova.conductor.manager [None req-8e2e6dd8-757f-4ac7-aac8-acac6f0d2687 tempest-ServerAddressesTestJSON-987249436 tempest-ServerAddressesTestJSON-987249436-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 635.128124] nova-conductor[52332]: Traceback (most recent call last): [ 635.128124] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 635.128124] nova-conductor[52332]: return func(*args, **kwargs) [ 635.128124] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 635.128124] nova-conductor[52332]: selections = self._select_destinations( [ 635.128124] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 635.128124] nova-conductor[52332]: selections = self._schedule( [ 635.128124] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 635.128124] nova-conductor[52332]: self._ensure_sufficient_hosts( [ 635.128124] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 635.128124] nova-conductor[52332]: raise exception.NoValidHost(reason=reason) [ 635.128124] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 635.128124] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last): [ 635.128124] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 635.128124] nova-conductor[52332]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 635.128124] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 635.128124] nova-conductor[52332]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 635.128124] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 635.128124] nova-conductor[52332]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 635.128124] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 635.128124] nova-conductor[52332]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 635.128124] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 635.128124] nova-conductor[52332]: ERROR nova.conductor.manager result = self.transport._send( [ 635.128124] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 635.128124] nova-conductor[52332]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 635.128124] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 635.128124] nova-conductor[52332]: ERROR nova.conductor.manager return 
self._send(target, ctxt, message, wait_for_reply, timeout,
[ 635.128124] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 635.128124] nova-conductor[52332]: ERROR nova.conductor.manager raise result
[ 635.128124] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 635.128124] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 635.128124] nova-conductor[52332]: ERROR nova.conductor.manager
[ 635.128124] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 635.128124] nova-conductor[52332]: ERROR nova.conductor.manager return func(*args, **kwargs)
[ 635.128124] nova-conductor[52332]: ERROR nova.conductor.manager
[ 635.128124] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 635.128124] nova-conductor[52332]: ERROR nova.conductor.manager selections = self._select_destinations(
[ 635.128124] nova-conductor[52332]: ERROR nova.conductor.manager
[ 635.128124] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 635.128124] nova-conductor[52332]: ERROR nova.conductor.manager selections = self._schedule(
[ 635.128124] nova-conductor[52332]: ERROR nova.conductor.manager
[ 635.128124] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 635.128124] nova-conductor[52332]: ERROR nova.conductor.manager self._ensure_sufficient_hosts(
[ 635.128124] nova-conductor[52332]: ERROR nova.conductor.manager
[ 635.128124] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 635.128124] nova-conductor[52332]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason)
[ 635.128124] nova-conductor[52332]: ERROR nova.conductor.manager
[ 635.128124] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 635.128124] nova-conductor[52332]: ERROR nova.conductor.manager
[ 635.128124] nova-conductor[52332]: ERROR nova.conductor.manager
[ 635.131410] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-8e2e6dd8-757f-4ac7-aac8-acac6f0d2687 tempest-ServerAddressesTestJSON-987249436 tempest-ServerAddressesTestJSON-987249436-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 635.131719] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-8e2e6dd8-757f-4ac7-aac8-acac6f0d2687 tempest-ServerAddressesTestJSON-987249436 tempest-ServerAddressesTestJSON-987249436-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 635.131975] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-8e2e6dd8-757f-4ac7-aac8-acac6f0d2687 tempest-ServerAddressesTestJSON-987249436 tempest-ServerAddressesTestJSON-987249436-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 635.149737] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell0:INSERT=54,nova_cell0:SELECT=34,nova_cell0:UPDATE=8,nova_cell0:SAVEPOINT=2,nova_cell0:RELEASE=2 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 635.196311] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-8e2e6dd8-757f-4ac7-aac8-acac6f0d2687 tempest-ServerAddressesTestJSON-987249436 tempest-ServerAddressesTestJSON-987249436-project-member] [instance: 59a3ec4c-b2c0-42ac-9f05-f780592dfb34] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 635.197024] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-8e2e6dd8-757f-4ac7-aac8-acac6f0d2687 tempest-ServerAddressesTestJSON-987249436 tempest-ServerAddressesTestJSON-987249436-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 635.197231] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-8e2e6dd8-757f-4ac7-aac8-acac6f0d2687 tempest-ServerAddressesTestJSON-987249436 tempest-ServerAddressesTestJSON-987249436-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 635.197660] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-8e2e6dd8-757f-4ac7-aac8-acac6f0d2687 tempest-ServerAddressesTestJSON-987249436 tempest-ServerAddressesTestJSON-987249436-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 635.204844] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-8e2e6dd8-757f-4ac7-aac8-acac6f0d2687 tempest-ServerAddressesTestJSON-987249436 tempest-ServerAddressesTestJSON-987249436-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 635.204844] nova-conductor[52332]: Traceback (most recent call last):
[ 635.204844] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 635.204844] nova-conductor[52332]: return func(*args, **kwargs)
[ 635.204844] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 635.204844] nova-conductor[52332]: selections = self._select_destinations(
[ 635.204844] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 635.204844] nova-conductor[52332]: selections = self._schedule(
[ 635.204844] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 635.204844] nova-conductor[52332]: self._ensure_sufficient_hosts(
[ 635.204844] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 635.204844] nova-conductor[52332]: raise exception.NoValidHost(reason=reason)
[ 635.204844] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 635.204844] nova-conductor[52332]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 635.205418] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-8e2e6dd8-757f-4ac7-aac8-acac6f0d2687 tempest-ServerAddressesTestJSON-987249436 tempest-ServerAddressesTestJSON-987249436-project-member] [instance: 59a3ec4c-b2c0-42ac-9f05-f780592dfb34] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 635.778637] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_api:SELECT=59,nova_api:UPDATE=2,nova_api:DELETE=4 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 635.779509] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_api:SELECT=68,nova_api:DELETE=3,nova_api:UPDATE=5 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 635.865524] nova-conductor[52331]: ERROR nova.conductor.manager [None req-56c73da2-a2f1-46b7-bf6a-1becc28e95b4 tempest-TenantUsagesTestJSON-1354869813 tempest-TenantUsagesTestJSON-1354869813-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 635.865524] nova-conductor[52331]: Traceback (most recent call last):
[ 635.865524] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 635.865524] nova-conductor[52331]: return func(*args, **kwargs)
[ 635.865524] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 635.865524] nova-conductor[52331]: selections = self._select_destinations(
[ 635.865524] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 635.865524] nova-conductor[52331]: selections = self._schedule(
[ 635.865524] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 635.865524] nova-conductor[52331]: self._ensure_sufficient_hosts(
[ 635.865524] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 635.865524] nova-conductor[52331]: raise exception.NoValidHost(reason=reason)
[ 635.865524] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 635.865524] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 635.865524] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 635.865524] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0],
[ 635.865524] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 635.865524] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations(
[ 635.865524] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 635.865524] nova-conductor[52331]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 635.865524] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 635.865524] nova-conductor[52331]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 635.865524] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 635.865524] nova-conductor[52331]: ERROR nova.conductor.manager result = self.transport._send(
[ 635.865524] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 635.865524] nova-conductor[52331]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message,
[ 635.865524] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 635.865524] nova-conductor[52331]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 635.865524] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 635.865524] nova-conductor[52331]: ERROR nova.conductor.manager raise result
[ 635.865524] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 635.865524] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 635.865524] nova-conductor[52331]: ERROR nova.conductor.manager
[ 635.865524] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 635.865524] nova-conductor[52331]: ERROR nova.conductor.manager return func(*args, **kwargs)
[ 635.865524] nova-conductor[52331]: ERROR nova.conductor.manager
[ 635.865524] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 635.865524] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._select_destinations(
[ 635.865524] nova-conductor[52331]: ERROR nova.conductor.manager
[ 635.865524] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 635.865524] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._schedule(
[ 635.865524] nova-conductor[52331]: ERROR nova.conductor.manager
[ 635.865524] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 635.865524] nova-conductor[52331]: ERROR nova.conductor.manager self._ensure_sufficient_hosts(
[ 635.865524] nova-conductor[52331]: ERROR nova.conductor.manager
[ 635.865524] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 635.865524] nova-conductor[52331]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason)
[ 635.865524] nova-conductor[52331]: ERROR nova.conductor.manager
[ 635.865524] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 635.865524] nova-conductor[52331]: ERROR nova.conductor.manager
[ 635.865524] nova-conductor[52331]: ERROR nova.conductor.manager
[ 635.873556] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-56c73da2-a2f1-46b7-bf6a-1becc28e95b4 tempest-TenantUsagesTestJSON-1354869813 tempest-TenantUsagesTestJSON-1354869813-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 635.873723] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-56c73da2-a2f1-46b7-bf6a-1becc28e95b4 tempest-TenantUsagesTestJSON-1354869813 tempest-TenantUsagesTestJSON-1354869813-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 635.874677] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-56c73da2-a2f1-46b7-bf6a-1becc28e95b4 tempest-TenantUsagesTestJSON-1354869813 tempest-TenantUsagesTestJSON-1354869813-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 635.957079] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-56c73da2-a2f1-46b7-bf6a-1becc28e95b4 tempest-TenantUsagesTestJSON-1354869813 tempest-TenantUsagesTestJSON-1354869813-project-member] [instance: f0424b3f-75e3-4a95-9bab-5b37a230a169] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 635.957819] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-56c73da2-a2f1-46b7-bf6a-1becc28e95b4 tempest-TenantUsagesTestJSON-1354869813 tempest-TenantUsagesTestJSON-1354869813-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 635.958038] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-56c73da2-a2f1-46b7-bf6a-1becc28e95b4 tempest-TenantUsagesTestJSON-1354869813 tempest-TenantUsagesTestJSON-1354869813-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 635.958219] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-56c73da2-a2f1-46b7-bf6a-1becc28e95b4 tempest-TenantUsagesTestJSON-1354869813 tempest-TenantUsagesTestJSON-1354869813-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 635.961557] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-56c73da2-a2f1-46b7-bf6a-1becc28e95b4 tempest-TenantUsagesTestJSON-1354869813 tempest-TenantUsagesTestJSON-1354869813-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 635.961557] nova-conductor[52331]: Traceback (most recent call last):
[ 635.961557] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 635.961557] nova-conductor[52331]: return func(*args, **kwargs)
[ 635.961557] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 635.961557] nova-conductor[52331]: selections = self._select_destinations(
[ 635.961557] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 635.961557] nova-conductor[52331]: selections = self._schedule(
[ 635.961557] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 635.961557] nova-conductor[52331]: self._ensure_sufficient_hosts(
[ 635.961557] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 635.961557] nova-conductor[52331]: raise exception.NoValidHost(reason=reason)
[ 635.961557] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 635.961557] nova-conductor[52331]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 635.962354] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-56c73da2-a2f1-46b7-bf6a-1becc28e95b4 tempest-TenantUsagesTestJSON-1354869813 tempest-TenantUsagesTestJSON-1354869813-project-member] [instance: f0424b3f-75e3-4a95-9bab-5b37a230a169] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 639.217447] nova-conductor[52332]: ERROR nova.conductor.manager [None req-68d84c4a-b6d0-47ab-a4b8-aeda2498808c tempest-MigrationsAdminTest-1879415182 tempest-MigrationsAdminTest-1879415182-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 639.217447] nova-conductor[52332]: Traceback (most recent call last):
[ 639.217447] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 639.217447] nova-conductor[52332]: return func(*args, **kwargs)
[ 639.217447] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 639.217447] nova-conductor[52332]: selections = self._select_destinations(
[ 639.217447] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 639.217447] nova-conductor[52332]: selections = self._schedule(
[ 639.217447] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 639.217447] nova-conductor[52332]: self._ensure_sufficient_hosts(
[ 639.217447] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 639.217447] nova-conductor[52332]: raise exception.NoValidHost(reason=reason)
[ 639.217447] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 639.217447] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 639.217447] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 639.217447] nova-conductor[52332]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0],
[ 639.217447] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 639.217447] nova-conductor[52332]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations(
[ 639.217447] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 639.217447] nova-conductor[52332]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 639.217447] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 639.217447] nova-conductor[52332]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 639.217447] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 639.217447] nova-conductor[52332]: ERROR nova.conductor.manager result = self.transport._send(
[ 639.217447] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 639.217447] nova-conductor[52332]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message,
[ 639.217447] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 639.217447] nova-conductor[52332]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 639.217447] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 639.217447] nova-conductor[52332]: ERROR nova.conductor.manager raise result
[ 639.217447] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 639.217447] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 639.217447] nova-conductor[52332]: ERROR nova.conductor.manager
[ 639.217447] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 639.217447] nova-conductor[52332]: ERROR nova.conductor.manager return func(*args, **kwargs)
[ 639.217447] nova-conductor[52332]: ERROR nova.conductor.manager
[ 639.217447] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 639.217447] nova-conductor[52332]: ERROR nova.conductor.manager selections = self._select_destinations(
[ 639.217447] nova-conductor[52332]: ERROR nova.conductor.manager
[ 639.217447] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 639.217447] nova-conductor[52332]: ERROR nova.conductor.manager selections = self._schedule(
[ 639.217447] nova-conductor[52332]: ERROR nova.conductor.manager
[ 639.217447] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 639.217447] nova-conductor[52332]: ERROR nova.conductor.manager self._ensure_sufficient_hosts(
[ 639.217447] nova-conductor[52332]: ERROR nova.conductor.manager
[ 639.217447] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 639.217447] nova-conductor[52332]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason)
[ 639.217447] nova-conductor[52332]: ERROR nova.conductor.manager
[ 639.217447] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 639.217447] nova-conductor[52332]: ERROR nova.conductor.manager
[ 639.217447] nova-conductor[52332]: ERROR nova.conductor.manager
[ 639.225837] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-68d84c4a-b6d0-47ab-a4b8-aeda2498808c tempest-MigrationsAdminTest-1879415182 tempest-MigrationsAdminTest-1879415182-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 639.226187] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-68d84c4a-b6d0-47ab-a4b8-aeda2498808c tempest-MigrationsAdminTest-1879415182 tempest-MigrationsAdminTest-1879415182-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 639.226535] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-68d84c4a-b6d0-47ab-a4b8-aeda2498808c tempest-MigrationsAdminTest-1879415182 tempest-MigrationsAdminTest-1879415182-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 639.294177] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-68d84c4a-b6d0-47ab-a4b8-aeda2498808c tempest-MigrationsAdminTest-1879415182 tempest-MigrationsAdminTest-1879415182-project-member] [instance: b0d088b8-0e65-4d7f-91a2-41db1c104f63] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 639.294935] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-68d84c4a-b6d0-47ab-a4b8-aeda2498808c tempest-MigrationsAdminTest-1879415182 tempest-MigrationsAdminTest-1879415182-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 639.295159] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-68d84c4a-b6d0-47ab-a4b8-aeda2498808c tempest-MigrationsAdminTest-1879415182 tempest-MigrationsAdminTest-1879415182-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 639.295328] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-68d84c4a-b6d0-47ab-a4b8-aeda2498808c tempest-MigrationsAdminTest-1879415182 tempest-MigrationsAdminTest-1879415182-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 639.298966] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-68d84c4a-b6d0-47ab-a4b8-aeda2498808c tempest-MigrationsAdminTest-1879415182 tempest-MigrationsAdminTest-1879415182-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 639.298966] nova-conductor[52332]: Traceback (most recent call last):
[ 639.298966] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 639.298966] nova-conductor[52332]: return func(*args, **kwargs)
[ 639.298966] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 639.298966] nova-conductor[52332]: selections = self._select_destinations(
[ 639.298966] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 639.298966] nova-conductor[52332]: selections = self._schedule(
[ 639.298966] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 639.298966] nova-conductor[52332]: self._ensure_sufficient_hosts(
[ 639.298966] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 639.298966] nova-conductor[52332]: raise exception.NoValidHost(reason=reason)
[ 639.298966] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 639.298966] nova-conductor[52332]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 639.299489] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-68d84c4a-b6d0-47ab-a4b8-aeda2498808c tempest-MigrationsAdminTest-1879415182 tempest-MigrationsAdminTest-1879415182-project-member] [instance: b0d088b8-0e65-4d7f-91a2-41db1c104f63] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 640.601597] nova-conductor[52331]: ERROR nova.conductor.manager [None req-8f991097-6df0-4c1e-acd9-15932ca9bc89 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 640.601597] nova-conductor[52331]: Traceback (most recent call last):
[ 640.601597] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 640.601597] nova-conductor[52331]: return func(*args, **kwargs)
[ 640.601597] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 640.601597] nova-conductor[52331]: selections = self._select_destinations(
[ 640.601597] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 640.601597] nova-conductor[52331]: selections = self._schedule(
[ 640.601597] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 640.601597] nova-conductor[52331]: self._ensure_sufficient_hosts(
[ 640.601597] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 640.601597] nova-conductor[52331]: raise exception.NoValidHost(reason=reason)
[ 640.601597] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 640.601597] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 640.601597] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 640.601597] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0],
[ 640.601597] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 640.601597] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations(
[ 640.601597] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 640.601597] nova-conductor[52331]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 640.601597] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 640.601597] nova-conductor[52331]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 640.601597] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 640.601597] nova-conductor[52331]: ERROR nova.conductor.manager result = self.transport._send(
[ 640.601597] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 640.601597] nova-conductor[52331]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message,
[ 640.601597] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 640.601597] nova-conductor[52331]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 640.601597] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 640.601597] nova-conductor[52331]: ERROR nova.conductor.manager raise result
[ 640.601597] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 640.601597] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 640.601597] nova-conductor[52331]: ERROR nova.conductor.manager
[ 640.601597] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 640.601597] nova-conductor[52331]: ERROR nova.conductor.manager return func(*args, **kwargs)
[ 640.601597] nova-conductor[52331]: ERROR nova.conductor.manager
[ 640.601597] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 640.601597] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._select_destinations(
[ 640.601597] nova-conductor[52331]: ERROR nova.conductor.manager
[ 640.601597] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 640.601597] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._schedule(
[ 640.601597] nova-conductor[52331]: ERROR nova.conductor.manager
[ 640.601597] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 640.601597] nova-conductor[52331]: ERROR nova.conductor.manager self._ensure_sufficient_hosts(
[ 640.601597] nova-conductor[52331]: ERROR nova.conductor.manager
[ 640.601597] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 640.601597] nova-conductor[52331]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason)
[ 640.601597] nova-conductor[52331]: ERROR nova.conductor.manager
[ 640.601597] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 640.601597] nova-conductor[52331]: ERROR nova.conductor.manager
[ 640.601597] nova-conductor[52331]: ERROR nova.conductor.manager
[ 640.610832] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-8f991097-6df0-4c1e-acd9-15932ca9bc89 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 640.610832] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-8f991097-6df0-4c1e-acd9-15932ca9bc89 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 640.610993] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-8f991097-6df0-4c1e-acd9-15932ca9bc89 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 640.634564] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell0:SELECT=36,nova_cell0:SAVEPOINT=3,nova_cell0:INSERT=50,nova_cell0:RELEASE=3,nova_cell0:UPDATE=8 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 640.682714] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-8f991097-6df0-4c1e-acd9-15932ca9bc89 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] [instance: 0cfbd866-bd14-4efd-8bd7-e56c023b0051] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 640.683430] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-8f991097-6df0-4c1e-acd9-15932ca9bc89 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 640.683691] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-8f991097-6df0-4c1e-acd9-15932ca9bc89 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[
640.683865] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-8f991097-6df0-4c1e-acd9-15932ca9bc89 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 640.697294] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-8f991097-6df0-4c1e-acd9-15932ca9bc89 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 640.697294] nova-conductor[52331]: Traceback (most recent call last): [ 640.697294] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 640.697294] nova-conductor[52331]: return func(*args, **kwargs) [ 640.697294] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 640.697294] nova-conductor[52331]: selections = self._select_destinations( [ 640.697294] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 640.697294] nova-conductor[52331]: selections = self._schedule( [ 640.697294] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 640.697294] nova-conductor[52331]: self._ensure_sufficient_hosts( [ 640.697294] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 640.697294] nova-conductor[52331]: raise exception.NoValidHost(reason=reason) [ 640.697294] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 640.697294] nova-conductor[52331]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 640.697809] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-8f991097-6df0-4c1e-acd9-15932ca9bc89 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] [instance: 0cfbd866-bd14-4efd-8bd7-e56c023b0051] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 640.755034] nova-conductor[52332]: ERROR nova.conductor.manager [None req-cf585a15-a3b0-438b-aa26-2c380cc7d073 tempest-FloatingIPsAssociationNegativeTestJSON-2141388099 tempest-FloatingIPsAssociationNegativeTestJSON-2141388099-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 640.755034] nova-conductor[52332]: Traceback (most recent call last): [ 640.755034] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 640.755034] nova-conductor[52332]: return func(*args, **kwargs) [ 640.755034] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 640.755034] nova-conductor[52332]: selections = self._select_destinations( [ 640.755034] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 640.755034] nova-conductor[52332]: selections = self._schedule( [ 640.755034] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 640.755034] nova-conductor[52332]: self._ensure_sufficient_hosts( [ 640.755034] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 640.755034] nova-conductor[52332]: raise exception.NoValidHost(reason=reason) [ 640.755034] nova-conductor[52332]: 
nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 640.755034] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last): [ 640.755034] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 640.755034] nova-conductor[52332]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 640.755034] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 640.755034] nova-conductor[52332]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 640.755034] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 640.755034] nova-conductor[52332]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 640.755034] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 640.755034] nova-conductor[52332]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 640.755034] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 640.755034] nova-conductor[52332]: ERROR nova.conductor.manager result = self.transport._send( [ 640.755034] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 640.755034] nova-conductor[52332]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 640.755034] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 
689, in send [ 640.755034] nova-conductor[52332]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 640.755034] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 640.755034] nova-conductor[52332]: ERROR nova.conductor.manager raise result [ 640.755034] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 640.755034] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last): [ 640.755034] nova-conductor[52332]: ERROR nova.conductor.manager [ 640.755034] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 640.755034] nova-conductor[52332]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 640.755034] nova-conductor[52332]: ERROR nova.conductor.manager [ 640.755034] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 640.755034] nova-conductor[52332]: ERROR nova.conductor.manager selections = self._select_destinations( [ 640.755034] nova-conductor[52332]: ERROR nova.conductor.manager [ 640.755034] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 640.755034] nova-conductor[52332]: ERROR nova.conductor.manager selections = self._schedule( [ 640.755034] nova-conductor[52332]: ERROR nova.conductor.manager [ 640.755034] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 640.755034] nova-conductor[52332]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 640.755034] nova-conductor[52332]: ERROR nova.conductor.manager [ 
640.755034] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 640.755034] nova-conductor[52332]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 640.755034] nova-conductor[52332]: ERROR nova.conductor.manager [ 640.755034] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 640.755034] nova-conductor[52332]: ERROR nova.conductor.manager [ 640.755034] nova-conductor[52332]: ERROR nova.conductor.manager [ 640.763348] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-cf585a15-a3b0-438b-aa26-2c380cc7d073 tempest-FloatingIPsAssociationNegativeTestJSON-2141388099 tempest-FloatingIPsAssociationNegativeTestJSON-2141388099-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 640.763583] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-cf585a15-a3b0-438b-aa26-2c380cc7d073 tempest-FloatingIPsAssociationNegativeTestJSON-2141388099 tempest-FloatingIPsAssociationNegativeTestJSON-2141388099-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 640.763749] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-cf585a15-a3b0-438b-aa26-2c380cc7d073 tempest-FloatingIPsAssociationNegativeTestJSON-2141388099 tempest-FloatingIPsAssociationNegativeTestJSON-2141388099-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) 
inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 640.811605] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell0:INSERT=49,nova_cell0:SELECT=39,nova_cell0:UPDATE=8,nova_cell0:SAVEPOINT=2,nova_cell0:RELEASE=2 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 640.833210] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-cf585a15-a3b0-438b-aa26-2c380cc7d073 tempest-FloatingIPsAssociationNegativeTestJSON-2141388099 tempest-FloatingIPsAssociationNegativeTestJSON-2141388099-project-member] [instance: c1893051-ddf9-4df6-988a-a880b7899b93] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 640.833923] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-cf585a15-a3b0-438b-aa26-2c380cc7d073 tempest-FloatingIPsAssociationNegativeTestJSON-2141388099 tempest-FloatingIPsAssociationNegativeTestJSON-2141388099-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 640.834240] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-cf585a15-a3b0-438b-aa26-2c380cc7d073 tempest-FloatingIPsAssociationNegativeTestJSON-2141388099 
tempest-FloatingIPsAssociationNegativeTestJSON-2141388099-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 640.834333] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-cf585a15-a3b0-438b-aa26-2c380cc7d073 tempest-FloatingIPsAssociationNegativeTestJSON-2141388099 tempest-FloatingIPsAssociationNegativeTestJSON-2141388099-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 640.837810] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-cf585a15-a3b0-438b-aa26-2c380cc7d073 tempest-FloatingIPsAssociationNegativeTestJSON-2141388099 tempest-FloatingIPsAssociationNegativeTestJSON-2141388099-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. 
[ 640.837810] nova-conductor[52332]: Traceback (most recent call last): [ 640.837810] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 640.837810] nova-conductor[52332]: return func(*args, **kwargs) [ 640.837810] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 640.837810] nova-conductor[52332]: selections = self._select_destinations( [ 640.837810] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 640.837810] nova-conductor[52332]: selections = self._schedule( [ 640.837810] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 640.837810] nova-conductor[52332]: self._ensure_sufficient_hosts( [ 640.837810] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 640.837810] nova-conductor[52332]: raise exception.NoValidHost(reason=reason) [ 640.837810] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 640.837810] nova-conductor[52332]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 640.838429] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-cf585a15-a3b0-438b-aa26-2c380cc7d073 tempest-FloatingIPsAssociationNegativeTestJSON-2141388099 tempest-FloatingIPsAssociationNegativeTestJSON-2141388099-project-member] [instance: c1893051-ddf9-4df6-988a-a880b7899b93] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 642.168260] nova-conductor[52331]: ERROR nova.conductor.manager [None req-f853cb9c-d764-46e0-b364-0ad64ba7aed9 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 642.168260] nova-conductor[52331]: Traceback (most recent call last): [ 642.168260] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 642.168260] nova-conductor[52331]: return func(*args, **kwargs) [ 642.168260] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 642.168260] nova-conductor[52331]: selections = self._select_destinations( [ 642.168260] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 642.168260] nova-conductor[52331]: selections = self._schedule( [ 642.168260] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 642.168260] nova-conductor[52331]: self._ensure_sufficient_hosts( [ 642.168260] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 642.168260] nova-conductor[52331]: raise exception.NoValidHost(reason=reason) [ 642.168260] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 642.168260] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last): [ 642.168260] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 642.168260] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 642.168260] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 642.168260] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 642.168260] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 642.168260] nova-conductor[52331]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 642.168260] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 642.168260] nova-conductor[52331]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 642.168260] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 642.168260] nova-conductor[52331]: ERROR nova.conductor.manager result = self.transport._send( [ 642.168260] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 642.168260] nova-conductor[52331]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 642.168260] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 642.168260] nova-conductor[52331]: ERROR nova.conductor.manager return 
self._send(target, ctxt, message, wait_for_reply, timeout, [ 642.168260] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 642.168260] nova-conductor[52331]: ERROR nova.conductor.manager raise result [ 642.168260] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 642.168260] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last): [ 642.168260] nova-conductor[52331]: ERROR nova.conductor.manager [ 642.168260] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 642.168260] nova-conductor[52331]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 642.168260] nova-conductor[52331]: ERROR nova.conductor.manager [ 642.168260] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 642.168260] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._select_destinations( [ 642.168260] nova-conductor[52331]: ERROR nova.conductor.manager [ 642.168260] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 642.168260] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._schedule( [ 642.168260] nova-conductor[52331]: ERROR nova.conductor.manager [ 642.168260] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 642.168260] nova-conductor[52331]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 642.168260] nova-conductor[52331]: ERROR nova.conductor.manager [ 642.168260] nova-conductor[52331]: ERROR nova.conductor.manager File 
"/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 642.168260] nova-conductor[52331]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 642.168260] nova-conductor[52331]: ERROR nova.conductor.manager [ 642.168260] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 642.168260] nova-conductor[52331]: ERROR nova.conductor.manager [ 642.168260] nova-conductor[52331]: ERROR nova.conductor.manager [ 642.175150] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-f853cb9c-d764-46e0-b364-0ad64ba7aed9 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 642.175304] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-f853cb9c-d764-46e0-b364-0ad64ba7aed9 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 642.175471] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-f853cb9c-d764-46e0-b364-0ad64ba7aed9 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 642.230732] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-f853cb9c-d764-46e0-b364-0ad64ba7aed9 
tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] [instance: c2ac0db1-967f-4082-a621-1d8be60548ff] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 642.231772] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-f853cb9c-d764-46e0-b364-0ad64ba7aed9 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 642.231772] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-f853cb9c-d764-46e0-b364-0ad64ba7aed9 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 642.231916] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-f853cb9c-d764-46e0-b364-0ad64ba7aed9 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s 
{{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 642.235887] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-f853cb9c-d764-46e0-b364-0ad64ba7aed9 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 642.235887] nova-conductor[52331]: Traceback (most recent call last): [ 642.235887] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 642.235887] nova-conductor[52331]: return func(*args, **kwargs) [ 642.235887] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 642.235887] nova-conductor[52331]: selections = self._select_destinations( [ 642.235887] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 642.235887] nova-conductor[52331]: selections = self._schedule( [ 642.235887] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 642.235887] nova-conductor[52331]: self._ensure_sufficient_hosts( [ 642.235887] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 642.235887] nova-conductor[52331]: raise exception.NoValidHost(reason=reason) [ 642.235887] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 642.235887] nova-conductor[52331]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 642.236920] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-f853cb9c-d764-46e0-b364-0ad64ba7aed9 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] [instance: c2ac0db1-967f-4082-a621-1d8be60548ff] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 644.101815] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell1:SELECT=51,nova_cell1:UPDATE=13,nova_cell1:INSERT=1,nova_cell1:SAVEPOINT=8,nova_cell1:RELEASE=8 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 644.158050] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:SELECT=12,nova_cell1:INSERT=2,nova_cell1:UPDATE=3,nova_cell1:SAVEPOINT=1,nova_cell1:RELEASE=1 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 644.705502] nova-conductor[52332]: ERROR nova.conductor.manager [None req-87fa0e78-7944-4edb-b8a5-6ed5d26514c8 tempest-ServerActionsTestOtherA-1414460626 tempest-ServerActionsTestOtherA-1414460626-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 644.705502] nova-conductor[52332]: Traceback (most recent call last): [ 644.705502] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 644.705502] nova-conductor[52332]: return func(*args, **kwargs) [ 644.705502] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 644.705502] nova-conductor[52332]: selections = self._select_destinations( [ 644.705502] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 644.705502] nova-conductor[52332]: selections = self._schedule( [ 644.705502] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 644.705502] nova-conductor[52332]: self._ensure_sufficient_hosts( [ 644.705502] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 644.705502] nova-conductor[52332]: raise exception.NoValidHost(reason=reason) [ 644.705502] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 644.705502] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 644.705502] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 644.705502] nova-conductor[52332]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0],
[ 644.705502] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 644.705502] nova-conductor[52332]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations(
[ 644.705502] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 644.705502] nova-conductor[52332]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 644.705502] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 644.705502] nova-conductor[52332]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 644.705502] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 644.705502] nova-conductor[52332]: ERROR nova.conductor.manager result = self.transport._send(
[ 644.705502] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 644.705502] nova-conductor[52332]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message,
[ 644.705502] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 644.705502] nova-conductor[52332]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 644.705502] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 644.705502] nova-conductor[52332]: ERROR nova.conductor.manager raise result
[ 644.705502] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 644.705502] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 644.705502] nova-conductor[52332]: ERROR nova.conductor.manager
[ 644.705502] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 644.705502] nova-conductor[52332]: ERROR nova.conductor.manager return func(*args, **kwargs)
[ 644.705502] nova-conductor[52332]: ERROR nova.conductor.manager
[ 644.705502] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 644.705502] nova-conductor[52332]: ERROR nova.conductor.manager selections = self._select_destinations(
[ 644.705502] nova-conductor[52332]: ERROR nova.conductor.manager
[ 644.705502] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 644.705502] nova-conductor[52332]: ERROR nova.conductor.manager selections = self._schedule(
[ 644.705502] nova-conductor[52332]: ERROR nova.conductor.manager
[ 644.705502] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 644.705502] nova-conductor[52332]: ERROR nova.conductor.manager self._ensure_sufficient_hosts(
[ 644.705502] nova-conductor[52332]: ERROR nova.conductor.manager
[ 644.705502] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 644.705502] nova-conductor[52332]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason)
[ 644.705502] nova-conductor[52332]: ERROR nova.conductor.manager
[ 644.705502] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 644.705502] nova-conductor[52332]: ERROR nova.conductor.manager
[ 644.705502] nova-conductor[52332]: ERROR nova.conductor.manager
[ 644.724587] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-87fa0e78-7944-4edb-b8a5-6ed5d26514c8 tempest-ServerActionsTestOtherA-1414460626 tempest-ServerActionsTestOtherA-1414460626-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 644.724826] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-87fa0e78-7944-4edb-b8a5-6ed5d26514c8 tempest-ServerActionsTestOtherA-1414460626 tempest-ServerActionsTestOtherA-1414460626-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 644.725041] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-87fa0e78-7944-4edb-b8a5-6ed5d26514c8 tempest-ServerActionsTestOtherA-1414460626 tempest-ServerActionsTestOtherA-1414460626-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 644.782228] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-87fa0e78-7944-4edb-b8a5-6ed5d26514c8 tempest-ServerActionsTestOtherA-1414460626 tempest-ServerActionsTestOtherA-1414460626-project-member] [instance: 2a268f4f-2524-473e-ac70-c00b50221f2f] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 644.783041] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-87fa0e78-7944-4edb-b8a5-6ed5d26514c8 tempest-ServerActionsTestOtherA-1414460626 tempest-ServerActionsTestOtherA-1414460626-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 644.783263] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-87fa0e78-7944-4edb-b8a5-6ed5d26514c8 tempest-ServerActionsTestOtherA-1414460626 tempest-ServerActionsTestOtherA-1414460626-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 644.783466] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-87fa0e78-7944-4edb-b8a5-6ed5d26514c8 tempest-ServerActionsTestOtherA-1414460626 tempest-ServerActionsTestOtherA-1414460626-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 644.789425] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-87fa0e78-7944-4edb-b8a5-6ed5d26514c8 tempest-ServerActionsTestOtherA-1414460626 tempest-ServerActionsTestOtherA-1414460626-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 644.789425] nova-conductor[52332]: Traceback (most recent call last):
[ 644.789425] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 644.789425] nova-conductor[52332]: return func(*args, **kwargs)
[ 644.789425] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 644.789425] nova-conductor[52332]: selections = self._select_destinations(
[ 644.789425] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 644.789425] nova-conductor[52332]: selections = self._schedule(
[ 644.789425] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 644.789425] nova-conductor[52332]: self._ensure_sufficient_hosts(
[ 644.789425] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 644.789425] nova-conductor[52332]: raise exception.NoValidHost(reason=reason)
[ 644.789425] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 644.789425] nova-conductor[52332]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 644.789958] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-87fa0e78-7944-4edb-b8a5-6ed5d26514c8 tempest-ServerActionsTestOtherA-1414460626 tempest-ServerActionsTestOtherA-1414460626-project-member] [instance: 2a268f4f-2524-473e-ac70-c00b50221f2f] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 646.075933] nova-conductor[52331]: ERROR nova.conductor.manager [None req-15768dae-7b61-48d0-a647-f20a8aaa498a tempest-ServersAdminTestJSON-1804072097 tempest-ServersAdminTestJSON-1804072097-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 646.075933] nova-conductor[52331]: Traceback (most recent call last):
[ 646.075933] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 646.075933] nova-conductor[52331]: return func(*args, **kwargs)
[ 646.075933] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 646.075933] nova-conductor[52331]: selections = self._select_destinations(
[ 646.075933] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 646.075933] nova-conductor[52331]: selections = self._schedule(
[ 646.075933] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 646.075933] nova-conductor[52331]: self._ensure_sufficient_hosts(
[ 646.075933] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 646.075933] nova-conductor[52331]: raise exception.NoValidHost(reason=reason)
[ 646.075933] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 646.075933] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 646.075933] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 646.075933] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0],
[ 646.075933] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 646.075933] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations(
[ 646.075933] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 646.075933] nova-conductor[52331]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 646.075933] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 646.075933] nova-conductor[52331]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 646.075933] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 646.075933] nova-conductor[52331]: ERROR nova.conductor.manager result = self.transport._send(
[ 646.075933] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 646.075933] nova-conductor[52331]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message,
[ 646.075933] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 646.075933] nova-conductor[52331]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 646.075933] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 646.075933] nova-conductor[52331]: ERROR nova.conductor.manager raise result
[ 646.075933] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 646.075933] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 646.075933] nova-conductor[52331]: ERROR nova.conductor.manager
[ 646.075933] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 646.075933] nova-conductor[52331]: ERROR nova.conductor.manager return func(*args, **kwargs)
[ 646.075933] nova-conductor[52331]: ERROR nova.conductor.manager
[ 646.075933] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 646.075933] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._select_destinations(
[ 646.075933] nova-conductor[52331]: ERROR nova.conductor.manager
[ 646.075933] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 646.075933] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._schedule(
[ 646.075933] nova-conductor[52331]: ERROR nova.conductor.manager
[ 646.075933] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 646.075933] nova-conductor[52331]: ERROR nova.conductor.manager self._ensure_sufficient_hosts(
[ 646.075933] nova-conductor[52331]: ERROR nova.conductor.manager
[ 646.075933] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 646.075933] nova-conductor[52331]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason)
[ 646.075933] nova-conductor[52331]: ERROR nova.conductor.manager
[ 646.075933] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 646.075933] nova-conductor[52331]: ERROR nova.conductor.manager
[ 646.075933] nova-conductor[52331]: ERROR nova.conductor.manager
[ 646.088048] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-15768dae-7b61-48d0-a647-f20a8aaa498a tempest-ServersAdminTestJSON-1804072097 tempest-ServersAdminTestJSON-1804072097-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 646.088938] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-15768dae-7b61-48d0-a647-f20a8aaa498a tempest-ServersAdminTestJSON-1804072097 tempest-ServersAdminTestJSON-1804072097-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 646.089356] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-15768dae-7b61-48d0-a647-f20a8aaa498a tempest-ServersAdminTestJSON-1804072097 tempest-ServersAdminTestJSON-1804072097-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 646.117032] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell0:INSERT=54,nova_cell0:SELECT=34,nova_cell0:UPDATE=8,nova_cell0:SAVEPOINT=2,nova_cell0:RELEASE=2 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 646.162906] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-15768dae-7b61-48d0-a647-f20a8aaa498a tempest-ServersAdminTestJSON-1804072097 tempest-ServersAdminTestJSON-1804072097-project-member] [instance: 1c4832fa-c070-4ef3-9b18-a49cc12184f1] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 646.163823] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-15768dae-7b61-48d0-a647-f20a8aaa498a tempest-ServersAdminTestJSON-1804072097 tempest-ServersAdminTestJSON-1804072097-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 646.163957] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-15768dae-7b61-48d0-a647-f20a8aaa498a tempest-ServersAdminTestJSON-1804072097 tempest-ServersAdminTestJSON-1804072097-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 646.164203] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-15768dae-7b61-48d0-a647-f20a8aaa498a tempest-ServersAdminTestJSON-1804072097 tempest-ServersAdminTestJSON-1804072097-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 646.169539] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-15768dae-7b61-48d0-a647-f20a8aaa498a tempest-ServersAdminTestJSON-1804072097 tempest-ServersAdminTestJSON-1804072097-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 646.169539] nova-conductor[52331]: Traceback (most recent call last):
[ 646.169539] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 646.169539] nova-conductor[52331]: return func(*args, **kwargs)
[ 646.169539] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 646.169539] nova-conductor[52331]: selections = self._select_destinations(
[ 646.169539] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 646.169539] nova-conductor[52331]: selections = self._schedule(
[ 646.169539] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 646.169539] nova-conductor[52331]: self._ensure_sufficient_hosts(
[ 646.169539] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 646.169539] nova-conductor[52331]: raise exception.NoValidHost(reason=reason)
[ 646.169539] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 646.169539] nova-conductor[52331]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 646.170073] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-15768dae-7b61-48d0-a647-f20a8aaa498a tempest-ServersAdminTestJSON-1804072097 tempest-ServersAdminTestJSON-1804072097-project-member] [instance: 1c4832fa-c070-4ef3-9b18-a49cc12184f1] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 646.876720] nova-conductor[52332]: ERROR nova.conductor.manager [None req-140b230b-ba68-40e1-be56-98d03edd5ba8 tempest-SecurityGroupsTestJSON-1291402501 tempest-SecurityGroupsTestJSON-1291402501-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 646.876720] nova-conductor[52332]: Traceback (most recent call last):
[ 646.876720] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 646.876720] nova-conductor[52332]: return func(*args, **kwargs)
[ 646.876720] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 646.876720] nova-conductor[52332]: selections = self._select_destinations(
[ 646.876720] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 646.876720] nova-conductor[52332]: selections = self._schedule(
[ 646.876720] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 646.876720] nova-conductor[52332]: self._ensure_sufficient_hosts(
[ 646.876720] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 646.876720] nova-conductor[52332]: raise exception.NoValidHost(reason=reason)
[ 646.876720] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 646.876720] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 646.876720] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 646.876720] nova-conductor[52332]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0],
[ 646.876720] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 646.876720] nova-conductor[52332]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations(
[ 646.876720] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 646.876720] nova-conductor[52332]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 646.876720] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 646.876720] nova-conductor[52332]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 646.876720] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 646.876720] nova-conductor[52332]: ERROR nova.conductor.manager result = self.transport._send(
[ 646.876720] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 646.876720] nova-conductor[52332]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message,
[ 646.876720] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 646.876720] nova-conductor[52332]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 646.876720] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 646.876720] nova-conductor[52332]: ERROR nova.conductor.manager raise result
[ 646.876720] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 646.876720] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 646.876720] nova-conductor[52332]: ERROR nova.conductor.manager
[ 646.876720] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 646.876720] nova-conductor[52332]: ERROR nova.conductor.manager return func(*args, **kwargs)
[ 646.876720] nova-conductor[52332]: ERROR nova.conductor.manager
[ 646.876720] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 646.876720] nova-conductor[52332]: ERROR nova.conductor.manager selections = self._select_destinations(
[ 646.876720] nova-conductor[52332]: ERROR nova.conductor.manager
[ 646.876720] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 646.876720] nova-conductor[52332]: ERROR nova.conductor.manager selections = self._schedule(
[ 646.876720] nova-conductor[52332]: ERROR nova.conductor.manager
[ 646.876720] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 646.876720] nova-conductor[52332]: ERROR nova.conductor.manager self._ensure_sufficient_hosts(
[ 646.876720] nova-conductor[52332]: ERROR nova.conductor.manager
[ 646.876720] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 646.876720] nova-conductor[52332]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason)
[ 646.876720] nova-conductor[52332]: ERROR nova.conductor.manager
[ 646.876720] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 646.876720] nova-conductor[52332]: ERROR nova.conductor.manager
[ 646.876720] nova-conductor[52332]: ERROR nova.conductor.manager
[ 646.882948] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-140b230b-ba68-40e1-be56-98d03edd5ba8 tempest-SecurityGroupsTestJSON-1291402501 tempest-SecurityGroupsTestJSON-1291402501-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 646.883180] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-140b230b-ba68-40e1-be56-98d03edd5ba8 tempest-SecurityGroupsTestJSON-1291402501 tempest-SecurityGroupsTestJSON-1291402501-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 646.883742] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-140b230b-ba68-40e1-be56-98d03edd5ba8 tempest-SecurityGroupsTestJSON-1291402501 tempest-SecurityGroupsTestJSON-1291402501-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 646.939731] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-140b230b-ba68-40e1-be56-98d03edd5ba8 tempest-SecurityGroupsTestJSON-1291402501 tempest-SecurityGroupsTestJSON-1291402501-project-member] [instance: c16daa09-c0cc-4cc1-ba76-21154c7fcb36] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 646.940403] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-140b230b-ba68-40e1-be56-98d03edd5ba8 tempest-SecurityGroupsTestJSON-1291402501 tempest-SecurityGroupsTestJSON-1291402501-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 646.940616] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-140b230b-ba68-40e1-be56-98d03edd5ba8 tempest-SecurityGroupsTestJSON-1291402501 tempest-SecurityGroupsTestJSON-1291402501-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 646.940786] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-140b230b-ba68-40e1-be56-98d03edd5ba8 tempest-SecurityGroupsTestJSON-1291402501 tempest-SecurityGroupsTestJSON-1291402501-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 646.943877] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-140b230b-ba68-40e1-be56-98d03edd5ba8 tempest-SecurityGroupsTestJSON-1291402501 tempest-SecurityGroupsTestJSON-1291402501-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 646.943877] nova-conductor[52332]: Traceback (most recent call last):
[ 646.943877] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 646.943877] nova-conductor[52332]: return func(*args, **kwargs)
[ 646.943877] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 646.943877] nova-conductor[52332]: selections = self._select_destinations(
[ 646.943877] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 646.943877] nova-conductor[52332]: selections = self._schedule(
[ 646.943877] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 646.943877] nova-conductor[52332]: self._ensure_sufficient_hosts(
[ 646.943877] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 646.943877] nova-conductor[52332]: raise exception.NoValidHost(reason=reason)
[ 646.943877] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 646.943877] nova-conductor[52332]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 646.944540] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-140b230b-ba68-40e1-be56-98d03edd5ba8 tempest-SecurityGroupsTestJSON-1291402501 tempest-SecurityGroupsTestJSON-1291402501-project-member] [instance: c16daa09-c0cc-4cc1-ba76-21154c7fcb36] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 646.946623] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell0:UPDATE=11,nova_cell0:INSERT=46,nova_cell0:SELECT=39,nova_cell0:SAVEPOINT=2,nova_cell0:RELEASE=2 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 648.109636] nova-conductor[52331]: ERROR nova.conductor.manager [None req-3143eac6-ca76-4f38-b8f1-0437b211f5c5 tempest-AttachInterfacesV270Test-1149829814 tempest-AttachInterfacesV270Test-1149829814-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 648.109636] nova-conductor[52331]: Traceback (most recent call last):
[ 648.109636] nova-conductor[52331]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 648.109636] nova-conductor[52331]:     return func(*args, **kwargs)
[ 648.109636] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 648.109636] nova-conductor[52331]:     selections = self._select_destinations(
[ 648.109636] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 648.109636] nova-conductor[52331]:     selections = self._schedule(
[ 648.109636] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 648.109636] nova-conductor[52331]:     self._ensure_sufficient_hosts(
[ 648.109636] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 648.109636] nova-conductor[52331]:     raise exception.NoValidHost(reason=reason)
[ 648.109636] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 648.109636] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 648.109636] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 648.109636] nova-conductor[52331]: ERROR nova.conductor.manager     host_lists = self._schedule_instances(context, request_specs[0],
[ 648.109636] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 648.109636] nova-conductor[52331]: ERROR nova.conductor.manager     host_lists = self.query_client.select_destinations(
[ 648.109636] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 648.109636] nova-conductor[52331]: ERROR nova.conductor.manager     return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 648.109636] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 648.109636] nova-conductor[52331]: ERROR nova.conductor.manager     return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 648.109636] nova-conductor[52331]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 648.109636] nova-conductor[52331]: ERROR nova.conductor.manager     result = self.transport._send(
[ 648.109636] nova-conductor[52331]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 648.109636] nova-conductor[52331]: ERROR nova.conductor.manager     return self._driver.send(target, ctxt, message,
[ 648.109636] nova-conductor[52331]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 648.109636] nova-conductor[52331]: ERROR nova.conductor.manager     return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 648.109636] nova-conductor[52331]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 648.109636] nova-conductor[52331]: ERROR nova.conductor.manager     raise result
[ 648.109636] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 648.109636] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 648.109636] nova-conductor[52331]: ERROR nova.conductor.manager
[ 648.109636] nova-conductor[52331]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 648.109636] nova-conductor[52331]: ERROR nova.conductor.manager     return func(*args, **kwargs)
[ 648.109636] nova-conductor[52331]: ERROR nova.conductor.manager
[ 648.109636] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 648.109636] nova-conductor[52331]: ERROR nova.conductor.manager     selections = self._select_destinations(
[ 648.109636] nova-conductor[52331]: ERROR nova.conductor.manager
[ 648.109636] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 648.109636] nova-conductor[52331]: ERROR nova.conductor.manager     selections = self._schedule(
[ 648.109636] nova-conductor[52331]: ERROR nova.conductor.manager
[ 648.109636] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 648.109636] nova-conductor[52331]: ERROR nova.conductor.manager     self._ensure_sufficient_hosts(
[ 648.109636] nova-conductor[52331]: ERROR nova.conductor.manager
[ 648.109636] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 648.109636] nova-conductor[52331]: ERROR nova.conductor.manager     raise exception.NoValidHost(reason=reason)
[ 648.109636] nova-conductor[52331]: ERROR nova.conductor.manager
[ 648.109636] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 648.109636] nova-conductor[52331]: ERROR nova.conductor.manager
[ 648.109636] nova-conductor[52331]: ERROR nova.conductor.manager
[ 648.117958] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-3143eac6-ca76-4f38-b8f1-0437b211f5c5 tempest-AttachInterfacesV270Test-1149829814 tempest-AttachInterfacesV270Test-1149829814-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 648.118255] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-3143eac6-ca76-4f38-b8f1-0437b211f5c5 tempest-AttachInterfacesV270Test-1149829814 tempest-AttachInterfacesV270Test-1149829814-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 648.118433] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-3143eac6-ca76-4f38-b8f1-0437b211f5c5 tempest-AttachInterfacesV270Test-1149829814 tempest-AttachInterfacesV270Test-1149829814-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 648.170407] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-3143eac6-ca76-4f38-b8f1-0437b211f5c5 tempest-AttachInterfacesV270Test-1149829814 tempest-AttachInterfacesV270Test-1149829814-project-member] [instance: a7ebac46-0a5a-4044-a485-6686e5528f9b] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 648.171126] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-3143eac6-ca76-4f38-b8f1-0437b211f5c5 tempest-AttachInterfacesV270Test-1149829814 tempest-AttachInterfacesV270Test-1149829814-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 648.171354] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-3143eac6-ca76-4f38-b8f1-0437b211f5c5 tempest-AttachInterfacesV270Test-1149829814 tempest-AttachInterfacesV270Test-1149829814-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 648.171530] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-3143eac6-ca76-4f38-b8f1-0437b211f5c5 tempest-AttachInterfacesV270Test-1149829814 tempest-AttachInterfacesV270Test-1149829814-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 648.175013] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-3143eac6-ca76-4f38-b8f1-0437b211f5c5 tempest-AttachInterfacesV270Test-1149829814 tempest-AttachInterfacesV270Test-1149829814-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 648.175013] nova-conductor[52331]: Traceback (most recent call last):
[ 648.175013] nova-conductor[52331]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 648.175013] nova-conductor[52331]:     return func(*args, **kwargs)
[ 648.175013] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 648.175013] nova-conductor[52331]:     selections = self._select_destinations(
[ 648.175013] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 648.175013] nova-conductor[52331]:     selections = self._schedule(
[ 648.175013] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 648.175013] nova-conductor[52331]:     self._ensure_sufficient_hosts(
[ 648.175013] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 648.175013] nova-conductor[52331]:     raise exception.NoValidHost(reason=reason)
[ 648.175013] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 648.175013] nova-conductor[52331]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 648.175601] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-3143eac6-ca76-4f38-b8f1-0437b211f5c5 tempest-AttachInterfacesV270Test-1149829814 tempest-AttachInterfacesV270Test-1149829814-project-member] [instance: a7ebac46-0a5a-4044-a485-6686e5528f9b] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 648.405039] nova-conductor[52332]: ERROR nova.conductor.manager [None req-77abd1a3-91b2-4c83-89b7-ca7ff94c3a8d tempest-ServersAdminNegativeTestJSON-1572951032 tempest-ServersAdminNegativeTestJSON-1572951032-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 648.405039] nova-conductor[52332]: Traceback (most recent call last):
[ 648.405039] nova-conductor[52332]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 648.405039] nova-conductor[52332]:     return func(*args, **kwargs)
[ 648.405039] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 648.405039] nova-conductor[52332]:     selections = self._select_destinations(
[ 648.405039] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 648.405039] nova-conductor[52332]:     selections = self._schedule(
[ 648.405039] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 648.405039] nova-conductor[52332]:     self._ensure_sufficient_hosts(
[ 648.405039] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 648.405039] nova-conductor[52332]:     raise exception.NoValidHost(reason=reason)
[ 648.405039] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 648.405039] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 648.405039] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 648.405039] nova-conductor[52332]: ERROR nova.conductor.manager     host_lists = self._schedule_instances(context, request_specs[0],
[ 648.405039] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 648.405039] nova-conductor[52332]: ERROR nova.conductor.manager     host_lists = self.query_client.select_destinations(
[ 648.405039] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 648.405039] nova-conductor[52332]: ERROR nova.conductor.manager     return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 648.405039] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 648.405039] nova-conductor[52332]: ERROR nova.conductor.manager     return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 648.405039] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 648.405039] nova-conductor[52332]: ERROR nova.conductor.manager     result = self.transport._send(
[ 648.405039] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 648.405039] nova-conductor[52332]: ERROR nova.conductor.manager     return self._driver.send(target, ctxt, message,
[ 648.405039] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 648.405039] nova-conductor[52332]: ERROR nova.conductor.manager     return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 648.405039] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 648.405039] nova-conductor[52332]: ERROR nova.conductor.manager     raise result
[ 648.405039] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 648.405039] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 648.405039] nova-conductor[52332]: ERROR nova.conductor.manager
[ 648.405039] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 648.405039] nova-conductor[52332]: ERROR nova.conductor.manager     return func(*args, **kwargs)
[ 648.405039] nova-conductor[52332]: ERROR nova.conductor.manager
[ 648.405039] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 648.405039] nova-conductor[52332]: ERROR nova.conductor.manager     selections = self._select_destinations(
[ 648.405039] nova-conductor[52332]: ERROR nova.conductor.manager
[ 648.405039] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 648.405039] nova-conductor[52332]: ERROR nova.conductor.manager     selections = self._schedule(
[ 648.405039] nova-conductor[52332]: ERROR nova.conductor.manager
[ 648.405039] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 648.405039] nova-conductor[52332]: ERROR nova.conductor.manager     self._ensure_sufficient_hosts(
[ 648.405039] nova-conductor[52332]: ERROR nova.conductor.manager
[ 648.405039] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 648.405039] nova-conductor[52332]: ERROR nova.conductor.manager     raise exception.NoValidHost(reason=reason)
[ 648.405039] nova-conductor[52332]: ERROR nova.conductor.manager
[ 648.405039] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 648.405039] nova-conductor[52332]: ERROR nova.conductor.manager
[ 648.405039] nova-conductor[52332]: ERROR nova.conductor.manager
[ 648.412064] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-77abd1a3-91b2-4c83-89b7-ca7ff94c3a8d tempest-ServersAdminNegativeTestJSON-1572951032 tempest-ServersAdminNegativeTestJSON-1572951032-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 648.412305] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-77abd1a3-91b2-4c83-89b7-ca7ff94c3a8d tempest-ServersAdminNegativeTestJSON-1572951032 tempest-ServersAdminNegativeTestJSON-1572951032-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 648.412474] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-77abd1a3-91b2-4c83-89b7-ca7ff94c3a8d tempest-ServersAdminNegativeTestJSON-1572951032 tempest-ServersAdminNegativeTestJSON-1572951032-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 648.467185] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-77abd1a3-91b2-4c83-89b7-ca7ff94c3a8d tempest-ServersAdminNegativeTestJSON-1572951032 tempest-ServersAdminNegativeTestJSON-1572951032-project-member] [instance: 41ae650f-8940-4efb-ad90-ada47caf8d7e] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 648.467898] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-77abd1a3-91b2-4c83-89b7-ca7ff94c3a8d tempest-ServersAdminNegativeTestJSON-1572951032 tempest-ServersAdminNegativeTestJSON-1572951032-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 648.468678] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-77abd1a3-91b2-4c83-89b7-ca7ff94c3a8d tempest-ServersAdminNegativeTestJSON-1572951032 tempest-ServersAdminNegativeTestJSON-1572951032-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 648.468678] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-77abd1a3-91b2-4c83-89b7-ca7ff94c3a8d tempest-ServersAdminNegativeTestJSON-1572951032 tempest-ServersAdminNegativeTestJSON-1572951032-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 648.471115] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-77abd1a3-91b2-4c83-89b7-ca7ff94c3a8d tempest-ServersAdminNegativeTestJSON-1572951032 tempest-ServersAdminNegativeTestJSON-1572951032-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 648.471115] nova-conductor[52332]: Traceback (most recent call last):
[ 648.471115] nova-conductor[52332]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 648.471115] nova-conductor[52332]:     return func(*args, **kwargs)
[ 648.471115] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 648.471115] nova-conductor[52332]:     selections = self._select_destinations(
[ 648.471115] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 648.471115] nova-conductor[52332]:     selections = self._schedule(
[ 648.471115] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 648.471115] nova-conductor[52332]:     self._ensure_sufficient_hosts(
[ 648.471115] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 648.471115] nova-conductor[52332]:     raise exception.NoValidHost(reason=reason)
[ 648.471115] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 648.471115] nova-conductor[52332]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 648.472099] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-77abd1a3-91b2-4c83-89b7-ca7ff94c3a8d tempest-ServersAdminNegativeTestJSON-1572951032 tempest-ServersAdminNegativeTestJSON-1572951032-project-member] [instance: 41ae650f-8940-4efb-ad90-ada47caf8d7e] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 649.368833] nova-conductor[52331]: ERROR nova.conductor.manager [None req-8a682f9c-efe5-4414-8af2-8b81ecf9d9f2 tempest-ListServerFiltersTestJSON-304228457 tempest-ListServerFiltersTestJSON-304228457-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 649.368833] nova-conductor[52331]: Traceback (most recent call last):
[ 649.368833] nova-conductor[52331]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 649.368833] nova-conductor[52331]:     return func(*args, **kwargs)
[ 649.368833] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 649.368833] nova-conductor[52331]:     selections = self._select_destinations(
[ 649.368833] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 649.368833] nova-conductor[52331]:     selections = self._schedule(
[ 649.368833] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 649.368833] nova-conductor[52331]:     self._ensure_sufficient_hosts(
[ 649.368833] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 649.368833] nova-conductor[52331]:     raise exception.NoValidHost(reason=reason)
[ 649.368833] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 649.368833] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 649.368833] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 649.368833] nova-conductor[52331]: ERROR nova.conductor.manager     host_lists = self._schedule_instances(context, request_specs[0],
[ 649.368833] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 649.368833] nova-conductor[52331]: ERROR nova.conductor.manager     host_lists = self.query_client.select_destinations(
[ 649.368833] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 649.368833] nova-conductor[52331]: ERROR nova.conductor.manager     return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 649.368833] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 649.368833] nova-conductor[52331]: ERROR nova.conductor.manager     return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 649.368833] nova-conductor[52331]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 649.368833] nova-conductor[52331]: ERROR nova.conductor.manager     result = self.transport._send(
[ 649.368833] nova-conductor[52331]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 649.368833] nova-conductor[52331]: ERROR nova.conductor.manager     return self._driver.send(target, ctxt, message,
[ 649.368833] nova-conductor[52331]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 649.368833] nova-conductor[52331]: ERROR nova.conductor.manager     return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 649.368833] nova-conductor[52331]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 649.368833] nova-conductor[52331]: ERROR nova.conductor.manager     raise result
[ 649.368833] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 649.368833] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 649.368833] nova-conductor[52331]: ERROR nova.conductor.manager
[ 649.368833] nova-conductor[52331]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 649.368833] nova-conductor[52331]: ERROR nova.conductor.manager     return func(*args, **kwargs)
[ 649.368833] nova-conductor[52331]: ERROR nova.conductor.manager
[ 649.368833] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 649.368833] nova-conductor[52331]: ERROR nova.conductor.manager     selections = self._select_destinations(
[ 649.368833] nova-conductor[52331]: ERROR nova.conductor.manager
[ 649.368833] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 649.368833] nova-conductor[52331]: ERROR nova.conductor.manager     selections = self._schedule(
[ 649.368833] nova-conductor[52331]: ERROR nova.conductor.manager
[ 649.368833] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 649.368833] nova-conductor[52331]: ERROR nova.conductor.manager     self._ensure_sufficient_hosts(
[ 649.368833] nova-conductor[52331]: ERROR nova.conductor.manager
[ 649.368833] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 649.368833] nova-conductor[52331]: ERROR nova.conductor.manager     raise exception.NoValidHost(reason=reason)
[ 649.368833] nova-conductor[52331]: ERROR nova.conductor.manager
[ 649.368833] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 649.368833] nova-conductor[52331]: ERROR nova.conductor.manager
[ 649.368833] nova-conductor[52331]: ERROR nova.conductor.manager
[ 649.374419] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-8a682f9c-efe5-4414-8af2-8b81ecf9d9f2 tempest-ListServerFiltersTestJSON-304228457 tempest-ListServerFiltersTestJSON-304228457-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 649.374646] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-8a682f9c-efe5-4414-8af2-8b81ecf9d9f2 tempest-ListServerFiltersTestJSON-304228457 tempest-ListServerFiltersTestJSON-304228457-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 649.374814] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-8a682f9c-efe5-4414-8af2-8b81ecf9d9f2 tempest-ListServerFiltersTestJSON-304228457 tempest-ListServerFiltersTestJSON-304228457-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 649.419529] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell0:INSERT=48,nova_cell0:SELECT=39,nova_cell0:UPDATE=9,nova_cell0:SAVEPOINT=2,nova_cell0:RELEASE=2 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 649.428474] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-8a682f9c-efe5-4414-8af2-8b81ecf9d9f2 tempest-ListServerFiltersTestJSON-304228457 tempest-ListServerFiltersTestJSON-304228457-project-member] [instance: 470cb89d-9422-4260-9fe4-6ca88a76fc86] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 649.429804] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-8a682f9c-efe5-4414-8af2-8b81ecf9d9f2 tempest-ListServerFiltersTestJSON-304228457 tempest-ListServerFiltersTestJSON-304228457-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 649.430144] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-8a682f9c-efe5-4414-8af2-8b81ecf9d9f2 tempest-ListServerFiltersTestJSON-304228457 tempest-ListServerFiltersTestJSON-304228457-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 649.430470] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-8a682f9c-efe5-4414-8af2-8b81ecf9d9f2 tempest-ListServerFiltersTestJSON-304228457 tempest-ListServerFiltersTestJSON-304228457-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 649.434019] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-8a682f9c-efe5-4414-8af2-8b81ecf9d9f2 tempest-ListServerFiltersTestJSON-304228457 tempest-ListServerFiltersTestJSON-304228457-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 649.434019] nova-conductor[52331]: Traceback (most recent call last):
[ 649.434019] nova-conductor[52331]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 649.434019] nova-conductor[52331]:     return func(*args, **kwargs)
[ 649.434019] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 649.434019] nova-conductor[52331]:     selections = self._select_destinations(
[ 649.434019] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 649.434019] nova-conductor[52331]:     selections = self._schedule(
[ 649.434019] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 649.434019] nova-conductor[52331]:     self._ensure_sufficient_hosts(
[ 649.434019] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 649.434019] nova-conductor[52331]:     raise exception.NoValidHost(reason=reason)
[ 649.434019] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 649.434019] nova-conductor[52331]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 649.435534] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-8a682f9c-efe5-4414-8af2-8b81ecf9d9f2 tempest-ListServerFiltersTestJSON-304228457 tempest-ListServerFiltersTestJSON-304228457-project-member] [instance: 470cb89d-9422-4260-9fe4-6ca88a76fc86] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 650.071163] nova-conductor[52332]: ERROR nova.conductor.manager [None req-87c479f5-c9bf-4dac-918f-6f25441c4923 tempest-ServersAdminTestJSON-1804072097 tempest-ServersAdminTestJSON-1804072097-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 650.071163] nova-conductor[52332]: Traceback (most recent call last):
[ 650.071163] nova-conductor[52332]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 650.071163] nova-conductor[52332]:     return func(*args, **kwargs)
[ 650.071163] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 650.071163] nova-conductor[52332]:     selections = self._select_destinations(
[ 650.071163] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 650.071163] nova-conductor[52332]:     selections = self._schedule(
[ 650.071163] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 650.071163] nova-conductor[52332]:     self._ensure_sufficient_hosts(
[ 650.071163] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 650.071163] nova-conductor[52332]:     raise exception.NoValidHost(reason=reason)
[ 650.071163]
nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 650.071163] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last): [ 650.071163] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 650.071163] nova-conductor[52332]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 650.071163] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 650.071163] nova-conductor[52332]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 650.071163] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 650.071163] nova-conductor[52332]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 650.071163] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 650.071163] nova-conductor[52332]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 650.071163] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 650.071163] nova-conductor[52332]: ERROR nova.conductor.manager result = self.transport._send( [ 650.071163] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 650.071163] nova-conductor[52332]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 650.071163] nova-conductor[52332]: ERROR nova.conductor.manager File 
"/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 650.071163] nova-conductor[52332]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 650.071163] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 650.071163] nova-conductor[52332]: ERROR nova.conductor.manager raise result [ 650.071163] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 650.071163] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last): [ 650.071163] nova-conductor[52332]: ERROR nova.conductor.manager [ 650.071163] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 650.071163] nova-conductor[52332]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 650.071163] nova-conductor[52332]: ERROR nova.conductor.manager [ 650.071163] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 650.071163] nova-conductor[52332]: ERROR nova.conductor.manager selections = self._select_destinations( [ 650.071163] nova-conductor[52332]: ERROR nova.conductor.manager [ 650.071163] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 650.071163] nova-conductor[52332]: ERROR nova.conductor.manager selections = self._schedule( [ 650.071163] nova-conductor[52332]: ERROR nova.conductor.manager [ 650.071163] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 650.071163] nova-conductor[52332]: ERROR nova.conductor.manager 
self._ensure_sufficient_hosts( [ 650.071163] nova-conductor[52332]: ERROR nova.conductor.manager [ 650.071163] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 650.071163] nova-conductor[52332]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 650.071163] nova-conductor[52332]: ERROR nova.conductor.manager [ 650.071163] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 650.071163] nova-conductor[52332]: ERROR nova.conductor.manager [ 650.071163] nova-conductor[52332]: ERROR nova.conductor.manager [ 650.079946] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-87c479f5-c9bf-4dac-918f-6f25441c4923 tempest-ServersAdminTestJSON-1804072097 tempest-ServersAdminTestJSON-1804072097-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 650.080180] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-87c479f5-c9bf-4dac-918f-6f25441c4923 tempest-ServersAdminTestJSON-1804072097 tempest-ServersAdminTestJSON-1804072097-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 650.080373] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-87c479f5-c9bf-4dac-918f-6f25441c4923 tempest-ServersAdminTestJSON-1804072097 tempest-ServersAdminTestJSON-1804072097-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 650.120207] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-87c479f5-c9bf-4dac-918f-6f25441c4923 tempest-ServersAdminTestJSON-1804072097 tempest-ServersAdminTestJSON-1804072097-project-member] [instance: f31cc53b-df27-47aa-85f1-4ff66e66ca8f] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 650.121072] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-87c479f5-c9bf-4dac-918f-6f25441c4923 tempest-ServersAdminTestJSON-1804072097 tempest-ServersAdminTestJSON-1804072097-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 650.121348] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-87c479f5-c9bf-4dac-918f-6f25441c4923 tempest-ServersAdminTestJSON-1804072097 tempest-ServersAdminTestJSON-1804072097-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 650.121565] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-87c479f5-c9bf-4dac-918f-6f25441c4923 
tempest-ServersAdminTestJSON-1804072097 tempest-ServersAdminTestJSON-1804072097-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 650.126314] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-87c479f5-c9bf-4dac-918f-6f25441c4923 tempest-ServersAdminTestJSON-1804072097 tempest-ServersAdminTestJSON-1804072097-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 650.126314] nova-conductor[52332]: Traceback (most recent call last): [ 650.126314] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 650.126314] nova-conductor[52332]: return func(*args, **kwargs) [ 650.126314] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 650.126314] nova-conductor[52332]: selections = self._select_destinations( [ 650.126314] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 650.126314] nova-conductor[52332]: selections = self._schedule( [ 650.126314] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 650.126314] nova-conductor[52332]: self._ensure_sufficient_hosts( [ 650.126314] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 650.126314] nova-conductor[52332]: raise exception.NoValidHost(reason=reason) [ 650.126314] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 650.126314] nova-conductor[52332]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 650.126923] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-87c479f5-c9bf-4dac-918f-6f25441c4923 tempest-ServersAdminTestJSON-1804072097 tempest-ServersAdminTestJSON-1804072097-project-member] [instance: f31cc53b-df27-47aa-85f1-4ff66e66ca8f] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 650.853462] nova-conductor[52331]: ERROR nova.conductor.manager [None req-08e65435-974d-43a8-85e8-9e259a8088b6 tempest-ServersTestJSON-1202886103 tempest-ServersTestJSON-1202886103-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 650.853462] nova-conductor[52331]: Traceback (most recent call last): [ 650.853462] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 650.853462] nova-conductor[52331]: return func(*args, **kwargs) [ 650.853462] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 650.853462] nova-conductor[52331]: selections = self._select_destinations( [ 650.853462] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 650.853462] nova-conductor[52331]: selections = self._schedule( [ 650.853462] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 650.853462] nova-conductor[52331]: self._ensure_sufficient_hosts( [ 650.853462] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 650.853462] nova-conductor[52331]: raise exception.NoValidHost(reason=reason) [ 650.853462] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 650.853462] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last): [ 650.853462] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 650.853462] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 650.853462] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 650.853462] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 650.853462] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 650.853462] nova-conductor[52331]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 650.853462] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 650.853462] nova-conductor[52331]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 650.853462] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 650.853462] nova-conductor[52331]: ERROR nova.conductor.manager result = self.transport._send( [ 650.853462] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 650.853462] nova-conductor[52331]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 650.853462] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 650.853462] nova-conductor[52331]: ERROR nova.conductor.manager return 
self._send(target, ctxt, message, wait_for_reply, timeout, [ 650.853462] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 650.853462] nova-conductor[52331]: ERROR nova.conductor.manager raise result [ 650.853462] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 650.853462] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last): [ 650.853462] nova-conductor[52331]: ERROR nova.conductor.manager [ 650.853462] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 650.853462] nova-conductor[52331]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 650.853462] nova-conductor[52331]: ERROR nova.conductor.manager [ 650.853462] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 650.853462] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._select_destinations( [ 650.853462] nova-conductor[52331]: ERROR nova.conductor.manager [ 650.853462] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 650.853462] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._schedule( [ 650.853462] nova-conductor[52331]: ERROR nova.conductor.manager [ 650.853462] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 650.853462] nova-conductor[52331]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 650.853462] nova-conductor[52331]: ERROR nova.conductor.manager [ 650.853462] nova-conductor[52331]: ERROR nova.conductor.manager File 
"/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 650.853462] nova-conductor[52331]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 650.853462] nova-conductor[52331]: ERROR nova.conductor.manager [ 650.853462] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 650.853462] nova-conductor[52331]: ERROR nova.conductor.manager [ 650.853462] nova-conductor[52331]: ERROR nova.conductor.manager [ 650.861817] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-08e65435-974d-43a8-85e8-9e259a8088b6 tempest-ServersTestJSON-1202886103 tempest-ServersTestJSON-1202886103-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 650.862113] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-08e65435-974d-43a8-85e8-9e259a8088b6 tempest-ServersTestJSON-1202886103 tempest-ServersTestJSON-1202886103-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 650.862211] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-08e65435-974d-43a8-85e8-9e259a8088b6 tempest-ServersTestJSON-1202886103 tempest-ServersTestJSON-1202886103-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 650.907568] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-08e65435-974d-43a8-85e8-9e259a8088b6 
tempest-ServersTestJSON-1202886103 tempest-ServersTestJSON-1202886103-project-member] [instance: b631976e-6c45-4b88-a428-4bd121929f79] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 650.908500] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-08e65435-974d-43a8-85e8-9e259a8088b6 tempest-ServersTestJSON-1202886103 tempest-ServersTestJSON-1202886103-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 650.908842] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-08e65435-974d-43a8-85e8-9e259a8088b6 tempest-ServersTestJSON-1202886103 tempest-ServersTestJSON-1202886103-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 650.909145] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-08e65435-974d-43a8-85e8-9e259a8088b6 tempest-ServersTestJSON-1202886103 tempest-ServersTestJSON-1202886103-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s 
{{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 650.912478] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-08e65435-974d-43a8-85e8-9e259a8088b6 tempest-ServersTestJSON-1202886103 tempest-ServersTestJSON-1202886103-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 650.912478] nova-conductor[52331]: Traceback (most recent call last): [ 650.912478] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 650.912478] nova-conductor[52331]: return func(*args, **kwargs) [ 650.912478] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 650.912478] nova-conductor[52331]: selections = self._select_destinations( [ 650.912478] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 650.912478] nova-conductor[52331]: selections = self._schedule( [ 650.912478] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 650.912478] nova-conductor[52331]: self._ensure_sufficient_hosts( [ 650.912478] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 650.912478] nova-conductor[52331]: raise exception.NoValidHost(reason=reason) [ 650.912478] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 650.912478] nova-conductor[52331]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 650.913590] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-08e65435-974d-43a8-85e8-9e259a8088b6 tempest-ServersTestJSON-1202886103 tempest-ServersTestJSON-1202886103-project-member] [instance: b631976e-6c45-4b88-a428-4bd121929f79] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 652.869300] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_api:SELECT=68,nova_api:DELETE=6,nova_api:UPDATE=7 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 652.869300] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_api:SELECT=54,nova_api:UPDATE=6,nova_api:DELETE=7 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 652.987687] nova-conductor[52332]: ERROR nova.conductor.manager [None req-25024bac-156a-4dd7-848d-e59731d6be1a tempest-ListServerFiltersTestJSON-304228457 tempest-ListServerFiltersTestJSON-304228457-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 652.987687] nova-conductor[52332]: Traceback (most recent call last): [ 652.987687] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 652.987687] nova-conductor[52332]: return func(*args, **kwargs) [ 652.987687] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 652.987687] nova-conductor[52332]: selections = self._select_destinations( [ 652.987687] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 652.987687] nova-conductor[52332]: selections = self._schedule( [ 652.987687] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 652.987687] nova-conductor[52332]: self._ensure_sufficient_hosts( [ 652.987687] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 652.987687] nova-conductor[52332]: raise exception.NoValidHost(reason=reason) [ 652.987687] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 652.987687] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last): [ 652.987687] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 652.987687] nova-conductor[52332]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 652.987687] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 652.987687] nova-conductor[52332]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 652.987687] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 652.987687] nova-conductor[52332]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 652.987687] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 652.987687] nova-conductor[52332]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 652.987687] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 652.987687] nova-conductor[52332]: ERROR nova.conductor.manager result = self.transport._send( [ 652.987687] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 652.987687] nova-conductor[52332]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 652.987687] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 652.987687] nova-conductor[52332]: ERROR nova.conductor.manager return 
self._send(target, ctxt, message, wait_for_reply, timeout, [ 652.987687] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 652.987687] nova-conductor[52332]: ERROR nova.conductor.manager raise result [ 652.987687] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 652.987687] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last): [ 652.987687] nova-conductor[52332]: ERROR nova.conductor.manager [ 652.987687] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 652.987687] nova-conductor[52332]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 652.987687] nova-conductor[52332]: ERROR nova.conductor.manager [ 652.987687] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 652.987687] nova-conductor[52332]: ERROR nova.conductor.manager selections = self._select_destinations( [ 652.987687] nova-conductor[52332]: ERROR nova.conductor.manager [ 652.987687] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 652.987687] nova-conductor[52332]: ERROR nova.conductor.manager selections = self._schedule( [ 652.987687] nova-conductor[52332]: ERROR nova.conductor.manager [ 652.987687] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 652.987687] nova-conductor[52332]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 652.987687] nova-conductor[52332]: ERROR nova.conductor.manager [ 652.987687] nova-conductor[52332]: ERROR nova.conductor.manager File 
"/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 652.987687] nova-conductor[52332]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason)
[ 652.987687] nova-conductor[52332]: ERROR nova.conductor.manager
[ 652.987687] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 652.987687] nova-conductor[52332]: ERROR nova.conductor.manager
[ 652.987687] nova-conductor[52332]: ERROR nova.conductor.manager
[ 652.995702] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-25024bac-156a-4dd7-848d-e59731d6be1a tempest-ListServerFiltersTestJSON-304228457 tempest-ListServerFiltersTestJSON-304228457-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 652.995856] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-25024bac-156a-4dd7-848d-e59731d6be1a tempest-ListServerFiltersTestJSON-304228457 tempest-ListServerFiltersTestJSON-304228457-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 652.996056] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-25024bac-156a-4dd7-848d-e59731d6be1a tempest-ListServerFiltersTestJSON-304228457 tempest-ListServerFiltersTestJSON-304228457-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 653.010447] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell0:SELECT=41,nova_cell0:UPDATE=9,nova_cell0:INSERT=48,nova_cell0:SAVEPOINT=1,nova_cell0:RELEASE=1 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 653.055927] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-25024bac-156a-4dd7-848d-e59731d6be1a tempest-ListServerFiltersTestJSON-304228457 tempest-ListServerFiltersTestJSON-304228457-project-member] [instance: f6409b67-58f2-4943-b895-6268716becd9] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 653.056989] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-25024bac-156a-4dd7-848d-e59731d6be1a tempest-ListServerFiltersTestJSON-304228457 tempest-ListServerFiltersTestJSON-304228457-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 653.058267] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-25024bac-156a-4dd7-848d-e59731d6be1a tempest-ListServerFiltersTestJSON-304228457 tempest-ListServerFiltersTestJSON-304228457-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 653.058590] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-25024bac-156a-4dd7-848d-e59731d6be1a tempest-ListServerFiltersTestJSON-304228457 tempest-ListServerFiltersTestJSON-304228457-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 653.063424] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-25024bac-156a-4dd7-848d-e59731d6be1a tempest-ListServerFiltersTestJSON-304228457 tempest-ListServerFiltersTestJSON-304228457-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 653.063424] nova-conductor[52332]: Traceback (most recent call last):
[ 653.063424] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 653.063424] nova-conductor[52332]: return func(*args, **kwargs)
[ 653.063424] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 653.063424] nova-conductor[52332]: selections = self._select_destinations(
[ 653.063424] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 653.063424] nova-conductor[52332]: selections = self._schedule(
[ 653.063424] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 653.063424] nova-conductor[52332]: self._ensure_sufficient_hosts(
[ 653.063424] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 653.063424] nova-conductor[52332]: raise exception.NoValidHost(reason=reason)
[ 653.063424] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 653.063424] nova-conductor[52332]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 653.064659] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-25024bac-156a-4dd7-848d-e59731d6be1a tempest-ListServerFiltersTestJSON-304228457 tempest-ListServerFiltersTestJSON-304228457-project-member] [instance: f6409b67-58f2-4943-b895-6268716becd9] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 654.101551] nova-conductor[52331]: ERROR nova.conductor.manager [None req-21b4e836-7784-4fcd-8664-768f7e1b1ede tempest-VolumesAdminNegativeTest-977207138 tempest-VolumesAdminNegativeTest-977207138-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 654.101551] nova-conductor[52331]: Traceback (most recent call last):
[ 654.101551] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 654.101551] nova-conductor[52331]: return func(*args, **kwargs)
[ 654.101551] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 654.101551] nova-conductor[52331]: selections = self._select_destinations(
[ 654.101551] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 654.101551] nova-conductor[52331]: selections = self._schedule(
[ 654.101551] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 654.101551] nova-conductor[52331]: self._ensure_sufficient_hosts(
[ 654.101551] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 654.101551] nova-conductor[52331]: raise exception.NoValidHost(reason=reason)
[ 654.101551] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 654.101551] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 654.101551] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 654.101551] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0],
[ 654.101551] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 654.101551] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations(
[ 654.101551] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 654.101551] nova-conductor[52331]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 654.101551] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 654.101551] nova-conductor[52331]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 654.101551] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 654.101551] nova-conductor[52331]: ERROR nova.conductor.manager result = self.transport._send(
[ 654.101551] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 654.101551] nova-conductor[52331]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message,
[ 654.101551] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 654.101551] nova-conductor[52331]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 654.101551] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 654.101551] nova-conductor[52331]: ERROR nova.conductor.manager raise result
[ 654.101551] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 654.101551] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 654.101551] nova-conductor[52331]: ERROR nova.conductor.manager
[ 654.101551] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 654.101551] nova-conductor[52331]: ERROR nova.conductor.manager return func(*args, **kwargs)
[ 654.101551] nova-conductor[52331]: ERROR nova.conductor.manager
[ 654.101551] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 654.101551] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._select_destinations(
[ 654.101551] nova-conductor[52331]: ERROR nova.conductor.manager
[ 654.101551] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 654.101551] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._schedule(
[ 654.101551] nova-conductor[52331]: ERROR nova.conductor.manager
[ 654.101551] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 654.101551] nova-conductor[52331]: ERROR nova.conductor.manager self._ensure_sufficient_hosts(
[ 654.101551] nova-conductor[52331]: ERROR nova.conductor.manager
[ 654.101551] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 654.101551] nova-conductor[52331]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason)
[ 654.101551] nova-conductor[52331]: ERROR nova.conductor.manager
[ 654.101551] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 654.101551] nova-conductor[52331]: ERROR nova.conductor.manager
[ 654.101551] nova-conductor[52331]: ERROR nova.conductor.manager
[ 654.109142] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell1:SELECT=2,nova_cell1:UPDATE=1 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 654.113498] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-21b4e836-7784-4fcd-8664-768f7e1b1ede tempest-VolumesAdminNegativeTest-977207138 tempest-VolumesAdminNegativeTest-977207138-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 654.113498] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-21b4e836-7784-4fcd-8664-768f7e1b1ede tempest-VolumesAdminNegativeTest-977207138 tempest-VolumesAdminNegativeTest-977207138-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 654.113498] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-21b4e836-7784-4fcd-8664-768f7e1b1ede tempest-VolumesAdminNegativeTest-977207138 tempest-VolumesAdminNegativeTest-977207138-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 654.168862] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:SELECT=2 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 654.180729] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-21b4e836-7784-4fcd-8664-768f7e1b1ede tempest-VolumesAdminNegativeTest-977207138 tempest-VolumesAdminNegativeTest-977207138-project-member] [instance: 73afde63-8562-44eb-92c7-b86b3adbb552] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 654.181464] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-21b4e836-7784-4fcd-8664-768f7e1b1ede tempest-VolumesAdminNegativeTest-977207138 tempest-VolumesAdminNegativeTest-977207138-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 654.181771] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-21b4e836-7784-4fcd-8664-768f7e1b1ede tempest-VolumesAdminNegativeTest-977207138 tempest-VolumesAdminNegativeTest-977207138-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 654.182043] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-21b4e836-7784-4fcd-8664-768f7e1b1ede tempest-VolumesAdminNegativeTest-977207138 tempest-VolumesAdminNegativeTest-977207138-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 654.186877] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-21b4e836-7784-4fcd-8664-768f7e1b1ede tempest-VolumesAdminNegativeTest-977207138 tempest-VolumesAdminNegativeTest-977207138-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 654.186877] nova-conductor[52331]: Traceback (most recent call last):
[ 654.186877] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 654.186877] nova-conductor[52331]: return func(*args, **kwargs)
[ 654.186877] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 654.186877] nova-conductor[52331]: selections = self._select_destinations(
[ 654.186877] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 654.186877] nova-conductor[52331]: selections = self._schedule(
[ 654.186877] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 654.186877] nova-conductor[52331]: self._ensure_sufficient_hosts(
[ 654.186877] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 654.186877] nova-conductor[52331]: raise exception.NoValidHost(reason=reason)
[ 654.186877] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 654.186877] nova-conductor[52331]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 654.187753] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-21b4e836-7784-4fcd-8664-768f7e1b1ede tempest-VolumesAdminNegativeTest-977207138 tempest-VolumesAdminNegativeTest-977207138-project-member] [instance: 73afde63-8562-44eb-92c7-b86b3adbb552] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 654.202891] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell0:SELECT=42,nova_cell0:UPDATE=11,nova_cell0:INSERT=45,nova_cell0:SAVEPOINT=1,nova_cell0:RELEASE=1 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 655.582847] nova-conductor[52332]: ERROR nova.conductor.manager [None req-f53e9019-681c-408a-9bed-38d53c33dec7 tempest-ListServerFiltersTestJSON-304228457 tempest-ListServerFiltersTestJSON-304228457-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 655.582847] nova-conductor[52332]: Traceback (most recent call last):
[ 655.582847] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 655.582847] nova-conductor[52332]: return func(*args, **kwargs)
[ 655.582847] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 655.582847] nova-conductor[52332]: selections = self._select_destinations(
[ 655.582847] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 655.582847] nova-conductor[52332]: selections = self._schedule(
[ 655.582847] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 655.582847] nova-conductor[52332]: self._ensure_sufficient_hosts(
[ 655.582847] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 655.582847] nova-conductor[52332]: raise exception.NoValidHost(reason=reason)
[ 655.582847] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 655.582847] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 655.582847] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 655.582847] nova-conductor[52332]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0],
[ 655.582847] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 655.582847] nova-conductor[52332]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations(
[ 655.582847] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 655.582847] nova-conductor[52332]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 655.582847] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 655.582847] nova-conductor[52332]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 655.582847] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 655.582847] nova-conductor[52332]: ERROR nova.conductor.manager result = self.transport._send(
[ 655.582847] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 655.582847] nova-conductor[52332]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message,
[ 655.582847] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 655.582847] nova-conductor[52332]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 655.582847] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 655.582847] nova-conductor[52332]: ERROR nova.conductor.manager raise result
[ 655.582847] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 655.582847] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 655.582847] nova-conductor[52332]: ERROR nova.conductor.manager
[ 655.582847] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 655.582847] nova-conductor[52332]: ERROR nova.conductor.manager return func(*args, **kwargs)
[ 655.582847] nova-conductor[52332]: ERROR nova.conductor.manager
[ 655.582847] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 655.582847] nova-conductor[52332]: ERROR nova.conductor.manager selections = self._select_destinations(
[ 655.582847] nova-conductor[52332]: ERROR nova.conductor.manager
[ 655.582847] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 655.582847] nova-conductor[52332]: ERROR nova.conductor.manager selections = self._schedule(
[ 655.582847] nova-conductor[52332]: ERROR nova.conductor.manager
[ 655.582847] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 655.582847] nova-conductor[52332]: ERROR nova.conductor.manager self._ensure_sufficient_hosts(
[ 655.582847] nova-conductor[52332]: ERROR nova.conductor.manager
[ 655.582847] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 655.582847] nova-conductor[52332]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason)
[ 655.582847] nova-conductor[52332]: ERROR nova.conductor.manager
[ 655.582847] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 655.582847] nova-conductor[52332]: ERROR nova.conductor.manager
[ 655.582847] nova-conductor[52332]: ERROR nova.conductor.manager
[ 655.590317] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-f53e9019-681c-408a-9bed-38d53c33dec7 tempest-ListServerFiltersTestJSON-304228457 tempest-ListServerFiltersTestJSON-304228457-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 655.590547] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-f53e9019-681c-408a-9bed-38d53c33dec7 tempest-ListServerFiltersTestJSON-304228457 tempest-ListServerFiltersTestJSON-304228457-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 655.590712] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-f53e9019-681c-408a-9bed-38d53c33dec7 tempest-ListServerFiltersTestJSON-304228457 tempest-ListServerFiltersTestJSON-304228457-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 655.628541] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-f53e9019-681c-408a-9bed-38d53c33dec7 tempest-ListServerFiltersTestJSON-304228457 tempest-ListServerFiltersTestJSON-304228457-project-member] [instance: d4679d1e-f26d-4078-83bc-fb273c4b4f53] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 655.629244] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-f53e9019-681c-408a-9bed-38d53c33dec7 tempest-ListServerFiltersTestJSON-304228457 tempest-ListServerFiltersTestJSON-304228457-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 655.629456] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-f53e9019-681c-408a-9bed-38d53c33dec7 tempest-ListServerFiltersTestJSON-304228457 tempest-ListServerFiltersTestJSON-304228457-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 655.629621] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-f53e9019-681c-408a-9bed-38d53c33dec7 tempest-ListServerFiltersTestJSON-304228457 tempest-ListServerFiltersTestJSON-304228457-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 655.632406] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-f53e9019-681c-408a-9bed-38d53c33dec7 tempest-ListServerFiltersTestJSON-304228457 tempest-ListServerFiltersTestJSON-304228457-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 655.632406] nova-conductor[52332]: Traceback (most recent call last):
[ 655.632406] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 655.632406] nova-conductor[52332]: return func(*args, **kwargs)
[ 655.632406] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 655.632406] nova-conductor[52332]: selections = self._select_destinations(
[ 655.632406] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 655.632406] nova-conductor[52332]: selections = self._schedule(
[ 655.632406] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 655.632406] nova-conductor[52332]: self._ensure_sufficient_hosts(
[ 655.632406] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 655.632406] nova-conductor[52332]: raise exception.NoValidHost(reason=reason)
[ 655.632406] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 655.632406] nova-conductor[52332]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 655.632933] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-f53e9019-681c-408a-9bed-38d53c33dec7 tempest-ListServerFiltersTestJSON-304228457 tempest-ListServerFiltersTestJSON-304228457-project-member] [instance: d4679d1e-f26d-4078-83bc-fb273c4b4f53] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 655.814350] nova-conductor[52331]: ERROR nova.conductor.manager [None req-8b41a03c-cbff-4b7b-bb8a-340a385eda5a tempest-ServerShowV257Test-1297398490 tempest-ServerShowV257Test-1297398490-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 655.814350] nova-conductor[52331]: Traceback (most recent call last):
[ 655.814350] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 655.814350] nova-conductor[52331]: return func(*args, **kwargs)
[ 655.814350] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 655.814350] nova-conductor[52331]: selections = self._select_destinations(
[ 655.814350] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 655.814350] nova-conductor[52331]: selections = self._schedule(
[ 655.814350] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 655.814350] nova-conductor[52331]: self._ensure_sufficient_hosts(
[ 655.814350] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 655.814350] nova-conductor[52331]: raise exception.NoValidHost(reason=reason)
[ 655.814350] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 655.814350] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 655.814350] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 655.814350] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0],
[ 655.814350] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 655.814350] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations(
[ 655.814350] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 655.814350] nova-conductor[52331]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 655.814350] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 655.814350] nova-conductor[52331]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 655.814350] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 655.814350] nova-conductor[52331]: ERROR nova.conductor.manager result = self.transport._send(
[ 655.814350] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 655.814350] nova-conductor[52331]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message,
[ 655.814350] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 655.814350] nova-conductor[52331]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 655.814350] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 655.814350] nova-conductor[52331]: ERROR nova.conductor.manager raise result
[ 655.814350] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 655.814350] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 655.814350] nova-conductor[52331]: ERROR nova.conductor.manager
[ 655.814350] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 655.814350] nova-conductor[52331]: ERROR nova.conductor.manager return func(*args, **kwargs)
[ 655.814350] nova-conductor[52331]: ERROR nova.conductor.manager
[ 655.814350] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 655.814350] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._select_destinations(
[ 655.814350] nova-conductor[52331]: ERROR nova.conductor.manager
[ 655.814350] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 655.814350] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._schedule(
[ 655.814350] nova-conductor[52331]: ERROR nova.conductor.manager
[ 655.814350] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 655.814350] nova-conductor[52331]: ERROR nova.conductor.manager self._ensure_sufficient_hosts(
[ 655.814350] nova-conductor[52331]: ERROR nova.conductor.manager
[ 655.814350] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 655.814350] nova-conductor[52331]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason)
[ 655.814350] nova-conductor[52331]: ERROR nova.conductor.manager
[ 655.814350] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 655.814350] nova-conductor[52331]: ERROR nova.conductor.manager
[ 655.814350] nova-conductor[52331]: ERROR nova.conductor.manager
[ 655.821264] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-8b41a03c-cbff-4b7b-bb8a-340a385eda5a tempest-ServerShowV257Test-1297398490 tempest-ServerShowV257Test-1297398490-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 655.821508] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-8b41a03c-cbff-4b7b-bb8a-340a385eda5a tempest-ServerShowV257Test-1297398490 tempest-ServerShowV257Test-1297398490-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 655.821681] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-8b41a03c-cbff-4b7b-bb8a-340a385eda5a tempest-ServerShowV257Test-1297398490 tempest-ServerShowV257Test-1297398490-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 655.886632] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-8b41a03c-cbff-4b7b-bb8a-340a385eda5a tempest-ServerShowV257Test-1297398490 tempest-ServerShowV257Test-1297398490-project-member] [instance: ad3cb18a-2cb3-42f9-ac10-9a696756d7a6] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 655.887336] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-8b41a03c-cbff-4b7b-bb8a-340a385eda5a tempest-ServerShowV257Test-1297398490 tempest-ServerShowV257Test-1297398490-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 655.887540] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-8b41a03c-cbff-4b7b-bb8a-340a385eda5a tempest-ServerShowV257Test-1297398490 tempest-ServerShowV257Test-1297398490-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 655.887836] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-8b41a03c-cbff-4b7b-bb8a-340a385eda5a tempest-ServerShowV257Test-1297398490 tempest-ServerShowV257Test-1297398490-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 655.891062] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-8b41a03c-cbff-4b7b-bb8a-340a385eda5a tempest-ServerShowV257Test-1297398490 tempest-ServerShowV257Test-1297398490-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 655.891062] nova-conductor[52331]: Traceback (most recent call last):
[ 655.891062] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 655.891062] nova-conductor[52331]: return func(*args, **kwargs)
[ 655.891062] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 655.891062] nova-conductor[52331]: selections = self._select_destinations(
[ 655.891062] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 655.891062] nova-conductor[52331]: selections = self._schedule(
[ 655.891062] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 655.891062] nova-conductor[52331]: self._ensure_sufficient_hosts(
[ 655.891062] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 655.891062] nova-conductor[52331]: raise exception.NoValidHost(reason=reason)
[ 655.891062] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 655.891062] nova-conductor[52331]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 655.891655] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-8b41a03c-cbff-4b7b-bb8a-340a385eda5a tempest-ServerShowV257Test-1297398490 tempest-ServerShowV257Test-1297398490-project-member] [instance: ad3cb18a-2cb3-42f9-ac10-9a696756d7a6] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 657.808512] nova-conductor[52332]: ERROR nova.conductor.manager [None req-cd818c16-f041-47fa-8b50-5f506eeb1dfa tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 657.808512] nova-conductor[52332]: Traceback (most recent call last):
[ 657.808512] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 657.808512] nova-conductor[52332]: return func(*args, **kwargs)
[ 657.808512] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 657.808512] nova-conductor[52332]: selections = self._select_destinations(
[ 657.808512] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 657.808512] nova-conductor[52332]: selections = self._schedule(
[ 657.808512] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 657.808512] nova-conductor[52332]: self._ensure_sufficient_hosts(
[ 657.808512] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 657.808512] nova-conductor[52332]: raise exception.NoValidHost(reason=reason)
[ 657.808512] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 657.808512] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 657.808512] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 657.808512] nova-conductor[52332]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0],
[ 657.808512] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 657.808512] nova-conductor[52332]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations(
[ 657.808512] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 657.808512] nova-conductor[52332]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 657.808512] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 657.808512] nova-conductor[52332]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 657.808512] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 657.808512] nova-conductor[52332]: ERROR nova.conductor.manager result = self.transport._send(
[ 657.808512] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 657.808512] nova-conductor[52332]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message,
[ 657.808512] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 657.808512] nova-conductor[52332]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 657.808512] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 657.808512] nova-conductor[52332]: ERROR nova.conductor.manager raise result
[ 657.808512] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 657.808512] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 657.808512] nova-conductor[52332]: ERROR nova.conductor.manager
[ 657.808512] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 657.808512] nova-conductor[52332]: ERROR nova.conductor.manager return func(*args, **kwargs)
[ 657.808512] nova-conductor[52332]: ERROR nova.conductor.manager
[ 657.808512] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 657.808512] nova-conductor[52332]: ERROR nova.conductor.manager selections = self._select_destinations(
[ 657.808512] nova-conductor[52332]: ERROR nova.conductor.manager
[ 657.808512] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 657.808512] nova-conductor[52332]: ERROR nova.conductor.manager selections = self._schedule(
[ 657.808512] nova-conductor[52332]: ERROR nova.conductor.manager
[ 657.808512] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 657.808512] nova-conductor[52332]: ERROR nova.conductor.manager self._ensure_sufficient_hosts(
[ 657.808512] nova-conductor[52332]: ERROR nova.conductor.manager
[ 657.808512] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 657.808512] nova-conductor[52332]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason)
[ 657.808512] nova-conductor[52332]: ERROR nova.conductor.manager
[ 657.808512] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 657.808512] nova-conductor[52332]: ERROR nova.conductor.manager
[ 657.808512] nova-conductor[52332]: ERROR nova.conductor.manager
[ 657.816270] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-cd818c16-f041-47fa-8b50-5f506eeb1dfa tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 657.816366] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-cd818c16-f041-47fa-8b50-5f506eeb1dfa tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 657.816499] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-cd818c16-f041-47fa-8b50-5f506eeb1dfa tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 657.852235] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell0:INSERT=55,nova_cell0:SELECT=37,nova_cell0:UPDATE=8 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 657.869682] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-cd818c16-f041-47fa-8b50-5f506eeb1dfa tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] [instance: 3df621ab-b814-40a3-9614-de654ff0c6d4] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 657.870533] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-cd818c16-f041-47fa-8b50-5f506eeb1dfa tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 657.870764] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-cd818c16-f041-47fa-8b50-5f506eeb1dfa tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 657.871729] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-cd818c16-f041-47fa-8b50-5f506eeb1dfa tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 657.875357] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-cd818c16-f041-47fa-8b50-5f506eeb1dfa tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 657.875357] nova-conductor[52332]: Traceback (most recent call last):
[ 657.875357] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 657.875357] nova-conductor[52332]: return func(*args, **kwargs)
[ 657.875357] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 657.875357] nova-conductor[52332]: selections = self._select_destinations(
[ 657.875357] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 657.875357] nova-conductor[52332]: selections = self._schedule(
[ 657.875357] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 657.875357] nova-conductor[52332]: self._ensure_sufficient_hosts(
[ 657.875357] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 657.875357] nova-conductor[52332]: raise exception.NoValidHost(reason=reason)
[ 657.875357] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 657.875357] nova-conductor[52332]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 657.875932] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-cd818c16-f041-47fa-8b50-5f506eeb1dfa tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] [instance: 3df621ab-b814-40a3-9614-de654ff0c6d4] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 658.786296] nova-conductor[52331]: ERROR nova.conductor.manager [None req-7fa6f327-f516-4007-a329-92422e0536d9 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 658.786296] nova-conductor[52331]: Traceback (most recent call last):
[ 658.786296] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 658.786296] nova-conductor[52331]: return func(*args, **kwargs)
[ 658.786296] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 658.786296] nova-conductor[52331]: selections = self._select_destinations(
[ 658.786296] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 658.786296] nova-conductor[52331]: selections = self._schedule(
[ 658.786296] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 658.786296] nova-conductor[52331]: self._ensure_sufficient_hosts(
[ 658.786296] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 658.786296] nova-conductor[52331]: raise exception.NoValidHost(reason=reason)
[ 658.786296] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 658.786296] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 658.786296] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 658.786296] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0],
[ 658.786296] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 658.786296] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations(
[ 658.786296] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 658.786296] nova-conductor[52331]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 658.786296] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 658.786296] nova-conductor[52331]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 658.786296] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 658.786296] nova-conductor[52331]: ERROR nova.conductor.manager result = self.transport._send(
[ 658.786296] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 658.786296] nova-conductor[52331]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message,
[ 658.786296] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 658.786296] nova-conductor[52331]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 658.786296] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 658.786296] nova-conductor[52331]: ERROR nova.conductor.manager raise result
[ 658.786296] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 658.786296] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 658.786296] nova-conductor[52331]: ERROR nova.conductor.manager
[ 658.786296] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 658.786296] nova-conductor[52331]: ERROR nova.conductor.manager return func(*args, **kwargs)
[ 658.786296] nova-conductor[52331]: ERROR nova.conductor.manager
[ 658.786296] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 658.786296] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._select_destinations(
[ 658.786296] nova-conductor[52331]: ERROR nova.conductor.manager
[ 658.786296] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 658.786296] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._schedule(
[ 658.786296] nova-conductor[52331]: ERROR nova.conductor.manager
[ 658.786296] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 658.786296] nova-conductor[52331]: ERROR nova.conductor.manager self._ensure_sufficient_hosts(
[ 658.786296] nova-conductor[52331]: ERROR nova.conductor.manager
[ 658.786296] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 658.786296] nova-conductor[52331]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason)
[ 658.786296] nova-conductor[52331]: ERROR nova.conductor.manager
[ 658.786296] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 658.786296] nova-conductor[52331]: ERROR nova.conductor.manager
[ 658.786296] nova-conductor[52331]: ERROR nova.conductor.manager
[ 658.792317] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-7fa6f327-f516-4007-a329-92422e0536d9 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 658.792539] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-7fa6f327-f516-4007-a329-92422e0536d9 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 658.792711] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-7fa6f327-f516-4007-a329-92422e0536d9 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 658.852111] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-7fa6f327-f516-4007-a329-92422e0536d9 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] [instance: b24d6072-f41f-41e1-99f4-046ab592c9f2] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 658.852852] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-7fa6f327-f516-4007-a329-92422e0536d9 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 658.853027] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-7fa6f327-f516-4007-a329-92422e0536d9 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 658.853178] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-7fa6f327-f516-4007-a329-92422e0536d9 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 658.864034] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-7fa6f327-f516-4007-a329-92422e0536d9 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 658.864034] nova-conductor[52331]: Traceback (most recent call last):
[ 658.864034] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 658.864034] nova-conductor[52331]: return func(*args, **kwargs)
[ 658.864034] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 658.864034] nova-conductor[52331]: selections = self._select_destinations(
[ 658.864034] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 658.864034] nova-conductor[52331]: selections = self._schedule(
[ 658.864034] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 658.864034] nova-conductor[52331]: self._ensure_sufficient_hosts(
[ 658.864034] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 658.864034] nova-conductor[52331]: raise exception.NoValidHost(reason=reason)
[ 658.864034] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 658.864034] nova-conductor[52331]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 658.864034] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-7fa6f327-f516-4007-a329-92422e0536d9 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] [instance: b24d6072-f41f-41e1-99f4-046ab592c9f2] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 662.557128] nova-conductor[52331]: ERROR nova.scheduler.utils [None req-2818ef5d-25bb-4c92-ad65-da84b369cc0a tempest-ServerDiagnosticsV248Test-1628898399 tempest-ServerDiagnosticsV248Test-1628898399-project-member] [instance: 024de920-7822-450d-93d2-a95bfdd02ff6] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 537, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance 024de920-7822-450d-93d2-a95bfdd02ff6 was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"]
[ 662.559079] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-2818ef5d-25bb-4c92-ad65-da84b369cc0a tempest-ServerDiagnosticsV248Test-1628898399 tempest-ServerDiagnosticsV248Test-1628898399-project-member] Rescheduling: True {{(pid=52331) build_instances /opt/stack/nova/nova/conductor/manager.py:695}}
[ 662.561884] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-2818ef5d-25bb-4c92-ad65-da84b369cc0a tempest-ServerDiagnosticsV248Test-1628898399 tempest-ServerDiagnosticsV248Test-1628898399-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 024de920-7822-450d-93d2-a95bfdd02ff6.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 024de920-7822-450d-93d2-a95bfdd02ff6.
[ 662.562458] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-2818ef5d-25bb-4c92-ad65-da84b369cc0a tempest-ServerDiagnosticsV248Test-1628898399 tempest-ServerDiagnosticsV248Test-1628898399-project-member] [instance: 024de920-7822-450d-93d2-a95bfdd02ff6] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 024de920-7822-450d-93d2-a95bfdd02ff6.
[ 662.793953] nova-conductor[52331]: ERROR nova.conductor.manager [None req-59abb2f7-208e-45a9-8d7a-bf8bda933cd7 tempest-ServersAaction247Test-37369441 tempest-ServersAaction247Test-37369441-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 662.793953] nova-conductor[52331]: Traceback (most recent call last):
[ 662.793953] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 662.793953] nova-conductor[52331]: return func(*args, **kwargs)
[ 662.793953] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 662.793953] nova-conductor[52331]: selections = self._select_destinations(
[ 662.793953] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 662.793953] nova-conductor[52331]: selections = self._schedule(
[ 662.793953] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 662.793953] nova-conductor[52331]: self._ensure_sufficient_hosts(
[ 662.793953] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 662.793953] nova-conductor[52331]: raise exception.NoValidHost(reason=reason)
[ 662.793953]
nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 662.793953] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last): [ 662.793953] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 662.793953] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 662.793953] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 662.793953] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 662.793953] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 662.793953] nova-conductor[52331]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 662.793953] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 662.793953] nova-conductor[52331]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 662.793953] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 662.793953] nova-conductor[52331]: ERROR nova.conductor.manager result = self.transport._send( [ 662.793953] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 662.793953] nova-conductor[52331]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 662.793953] nova-conductor[52331]: ERROR nova.conductor.manager File 
"/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 662.793953] nova-conductor[52331]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 662.793953] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 662.793953] nova-conductor[52331]: ERROR nova.conductor.manager raise result [ 662.793953] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 662.793953] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last): [ 662.793953] nova-conductor[52331]: ERROR nova.conductor.manager [ 662.793953] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 662.793953] nova-conductor[52331]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 662.793953] nova-conductor[52331]: ERROR nova.conductor.manager [ 662.793953] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 662.793953] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._select_destinations( [ 662.793953] nova-conductor[52331]: ERROR nova.conductor.manager [ 662.793953] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 662.793953] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._schedule( [ 662.793953] nova-conductor[52331]: ERROR nova.conductor.manager [ 662.793953] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 662.793953] nova-conductor[52331]: ERROR nova.conductor.manager 
self._ensure_sufficient_hosts( [ 662.793953] nova-conductor[52331]: ERROR nova.conductor.manager [ 662.793953] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 662.793953] nova-conductor[52331]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 662.793953] nova-conductor[52331]: ERROR nova.conductor.manager [ 662.793953] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 662.793953] nova-conductor[52331]: ERROR nova.conductor.manager [ 662.793953] nova-conductor[52331]: ERROR nova.conductor.manager [ 662.805687] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-59abb2f7-208e-45a9-8d7a-bf8bda933cd7 tempest-ServersAaction247Test-37369441 tempest-ServersAaction247Test-37369441-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 662.805687] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-59abb2f7-208e-45a9-8d7a-bf8bda933cd7 tempest-ServersAaction247Test-37369441 tempest-ServersAaction247Test-37369441-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 662.806422] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-59abb2f7-208e-45a9-8d7a-bf8bda933cd7 tempest-ServersAaction247Test-37369441 tempest-ServersAaction247Test-37369441-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 662.829698] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell0:SELECT=38,nova_cell0:INSERT=50,nova_cell0:SAVEPOINT=2,nova_cell0:RELEASE=2,nova_cell0:UPDATE=8 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 662.994767] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-59abb2f7-208e-45a9-8d7a-bf8bda933cd7 tempest-ServersAaction247Test-37369441 tempest-ServersAaction247Test-37369441-project-member] [instance: d6f0ae23-d463-454a-80fa-1c05a74f452d] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 662.994767] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-59abb2f7-208e-45a9-8d7a-bf8bda933cd7 tempest-ServersAaction247Test-37369441 tempest-ServersAaction247Test-37369441-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 662.994767] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-59abb2f7-208e-45a9-8d7a-bf8bda933cd7 tempest-ServersAaction247Test-37369441 tempest-ServersAaction247Test-37369441-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 662.994767] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-59abb2f7-208e-45a9-8d7a-bf8bda933cd7 tempest-ServersAaction247Test-37369441 tempest-ServersAaction247Test-37369441-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 662.994767] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-59abb2f7-208e-45a9-8d7a-bf8bda933cd7 tempest-ServersAaction247Test-37369441 tempest-ServersAaction247Test-37369441-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 662.994767] nova-conductor[52331]: Traceback (most recent call last): [ 662.994767] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 662.994767] nova-conductor[52331]: return func(*args, **kwargs) [ 662.994767] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 662.994767] nova-conductor[52331]: selections = self._select_destinations( [ 662.994767] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 662.994767] nova-conductor[52331]: selections = self._schedule( [ 662.994767] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 662.994767] nova-conductor[52331]: self._ensure_sufficient_hosts( [ 662.994767] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 662.994767] nova-conductor[52331]: raise exception.NoValidHost(reason=reason) [ 
662.994767] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 662.994767] nova-conductor[52331]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 662.994767] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-59abb2f7-208e-45a9-8d7a-bf8bda933cd7 tempest-ServersAaction247Test-37369441 tempest-ServersAaction247Test-37369441-project-member] [instance: d6f0ae23-d463-454a-80fa-1c05a74f452d] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 663.591500] nova-conductor[52332]: ERROR nova.conductor.manager [None req-0c6e54cb-8c44-4aea-88c5-2d7bffd1e5f6 tempest-MigrationsAdminTest-1879415182 tempest-MigrationsAdminTest-1879415182-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 663.591500] nova-conductor[52332]: Traceback (most recent call last): [ 663.591500] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 663.591500] nova-conductor[52332]: return func(*args, **kwargs) [ 663.591500] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 663.591500] nova-conductor[52332]: selections = self._select_destinations( [ 663.591500] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 663.591500] nova-conductor[52332]: selections = self._schedule( [ 663.591500] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 663.591500] nova-conductor[52332]: self._ensure_sufficient_hosts( [ 663.591500] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 663.591500] nova-conductor[52332]: raise exception.NoValidHost(reason=reason) [ 663.591500] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 663.591500] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last): [ 663.591500] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 663.591500] nova-conductor[52332]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 663.591500] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 663.591500] nova-conductor[52332]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 663.591500] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 663.591500] nova-conductor[52332]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 663.591500] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 663.591500] nova-conductor[52332]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 663.591500] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 663.591500] nova-conductor[52332]: ERROR nova.conductor.manager result = self.transport._send( [ 663.591500] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 663.591500] nova-conductor[52332]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 663.591500] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 663.591500] nova-conductor[52332]: ERROR nova.conductor.manager return 
self._send(target, ctxt, message, wait_for_reply, timeout, [ 663.591500] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 663.591500] nova-conductor[52332]: ERROR nova.conductor.manager raise result [ 663.591500] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 663.591500] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last): [ 663.591500] nova-conductor[52332]: ERROR nova.conductor.manager [ 663.591500] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 663.591500] nova-conductor[52332]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 663.591500] nova-conductor[52332]: ERROR nova.conductor.manager [ 663.591500] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 663.591500] nova-conductor[52332]: ERROR nova.conductor.manager selections = self._select_destinations( [ 663.591500] nova-conductor[52332]: ERROR nova.conductor.manager [ 663.591500] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 663.591500] nova-conductor[52332]: ERROR nova.conductor.manager selections = self._schedule( [ 663.591500] nova-conductor[52332]: ERROR nova.conductor.manager [ 663.591500] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 663.591500] nova-conductor[52332]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 663.591500] nova-conductor[52332]: ERROR nova.conductor.manager [ 663.591500] nova-conductor[52332]: ERROR nova.conductor.manager File 
"/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 663.591500] nova-conductor[52332]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 663.591500] nova-conductor[52332]: ERROR nova.conductor.manager [ 663.591500] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 663.591500] nova-conductor[52332]: ERROR nova.conductor.manager [ 663.591500] nova-conductor[52332]: ERROR nova.conductor.manager [ 663.598656] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-0c6e54cb-8c44-4aea-88c5-2d7bffd1e5f6 tempest-MigrationsAdminTest-1879415182 tempest-MigrationsAdminTest-1879415182-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 663.599181] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-0c6e54cb-8c44-4aea-88c5-2d7bffd1e5f6 tempest-MigrationsAdminTest-1879415182 tempest-MigrationsAdminTest-1879415182-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 663.599283] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-0c6e54cb-8c44-4aea-88c5-2d7bffd1e5f6 tempest-MigrationsAdminTest-1879415182 tempest-MigrationsAdminTest-1879415182-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 663.645348] nova-conductor[52332]: DEBUG nova.conductor.manager [None 
req-0c6e54cb-8c44-4aea-88c5-2d7bffd1e5f6 tempest-MigrationsAdminTest-1879415182 tempest-MigrationsAdminTest-1879415182-project-member] [instance: b1c9e61d-f598-4439-b9dd-63f7ac4663b2] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 663.646148] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-0c6e54cb-8c44-4aea-88c5-2d7bffd1e5f6 tempest-MigrationsAdminTest-1879415182 tempest-MigrationsAdminTest-1879415182-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 663.646351] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-0c6e54cb-8c44-4aea-88c5-2d7bffd1e5f6 tempest-MigrationsAdminTest-1879415182 tempest-MigrationsAdminTest-1879415182-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 663.646525] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-0c6e54cb-8c44-4aea-88c5-2d7bffd1e5f6 tempest-MigrationsAdminTest-1879415182 tempest-MigrationsAdminTest-1879415182-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 663.651365] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-0c6e54cb-8c44-4aea-88c5-2d7bffd1e5f6 tempest-MigrationsAdminTest-1879415182 tempest-MigrationsAdminTest-1879415182-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 663.651365] nova-conductor[52332]: Traceback (most recent call last): [ 663.651365] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 663.651365] nova-conductor[52332]: return func(*args, **kwargs) [ 663.651365] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 663.651365] nova-conductor[52332]: selections = self._select_destinations( [ 663.651365] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 663.651365] nova-conductor[52332]: selections = self._schedule( [ 663.651365] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 663.651365] nova-conductor[52332]: self._ensure_sufficient_hosts( [ 663.651365] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 663.651365] nova-conductor[52332]: raise exception.NoValidHost(reason=reason) [ 663.651365] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 663.651365] nova-conductor[52332]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 663.651900] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-0c6e54cb-8c44-4aea-88c5-2d7bffd1e5f6 tempest-MigrationsAdminTest-1879415182 tempest-MigrationsAdminTest-1879415182-project-member] [instance: b1c9e61d-f598-4439-b9dd-63f7ac4663b2] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 666.987726] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_api:SELECT=64,nova_api:UPDATE=7,nova_api:DELETE=1 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 666.990119] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_api:SELECT=46,nova_api:DELETE=10,nova_api:UPDATE=4 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 667.079055] nova-conductor[52331]: ERROR nova.conductor.manager [None req-12821ee7-ee94-46f2-ba0f-5a60635ca489 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 667.079055] nova-conductor[52331]: Traceback (most recent call last): [ 667.079055] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 667.079055] nova-conductor[52331]: return func(*args, **kwargs) [ 667.079055] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 667.079055] nova-conductor[52331]: selections = self._select_destinations( [ 667.079055] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 667.079055] nova-conductor[52331]: selections = self._schedule( [ 667.079055] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 667.079055] nova-conductor[52331]: self._ensure_sufficient_hosts( [ 667.079055] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 667.079055] nova-conductor[52331]: raise exception.NoValidHost(reason=reason) [ 667.079055] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 667.079055] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last): [ 667.079055] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 667.079055] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 667.079055] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 667.079055] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 667.079055] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 667.079055] nova-conductor[52331]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 667.079055] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 667.079055] nova-conductor[52331]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 667.079055] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 667.079055] nova-conductor[52331]: ERROR nova.conductor.manager result = self.transport._send( [ 667.079055] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 667.079055] nova-conductor[52331]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 667.079055] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 667.079055] nova-conductor[52331]: ERROR nova.conductor.manager return 
self._send(target, ctxt, message, wait_for_reply, timeout,
[ 667.079055] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 667.079055] nova-conductor[52331]: ERROR nova.conductor.manager raise result
[ 667.079055] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 667.079055] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 667.079055] nova-conductor[52331]: ERROR nova.conductor.manager
[ 667.079055] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 667.079055] nova-conductor[52331]: ERROR nova.conductor.manager return func(*args, **kwargs)
[ 667.079055] nova-conductor[52331]: ERROR nova.conductor.manager
[ 667.079055] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 667.079055] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._select_destinations(
[ 667.079055] nova-conductor[52331]: ERROR nova.conductor.manager
[ 667.079055] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 667.079055] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._schedule(
[ 667.079055] nova-conductor[52331]: ERROR nova.conductor.manager
[ 667.079055] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 667.079055] nova-conductor[52331]: ERROR nova.conductor.manager self._ensure_sufficient_hosts(
[ 667.079055] nova-conductor[52331]: ERROR nova.conductor.manager
[ 667.079055] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 667.079055] nova-conductor[52331]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason)
[ 667.079055] nova-conductor[52331]: ERROR nova.conductor.manager
[ 667.079055] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 667.079055] nova-conductor[52331]: ERROR nova.conductor.manager
[ 667.079055] nova-conductor[52331]: ERROR nova.conductor.manager
[ 667.086462] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-12821ee7-ee94-46f2-ba0f-5a60635ca489 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 667.086707] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-12821ee7-ee94-46f2-ba0f-5a60635ca489 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 667.086882] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-12821ee7-ee94-46f2-ba0f-5a60635ca489 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 667.155607] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-12821ee7-ee94-46f2-ba0f-5a60635ca489 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] [instance: 6d997056-e7cf-44c8-9087-f983c836251b] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 667.156689] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-12821ee7-ee94-46f2-ba0f-5a60635ca489 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 667.157241] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-12821ee7-ee94-46f2-ba0f-5a60635ca489 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 667.157241] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-12821ee7-ee94-46f2-ba0f-5a60635ca489 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 667.161023] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-12821ee7-ee94-46f2-ba0f-5a60635ca489 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 667.161023] nova-conductor[52331]: Traceback (most recent call last):
[ 667.161023] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 667.161023] nova-conductor[52331]: return func(*args, **kwargs)
[ 667.161023] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 667.161023] nova-conductor[52331]: selections = self._select_destinations(
[ 667.161023] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 667.161023] nova-conductor[52331]: selections = self._schedule(
[ 667.161023] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 667.161023] nova-conductor[52331]: self._ensure_sufficient_hosts(
[ 667.161023] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 667.161023] nova-conductor[52331]: raise exception.NoValidHost(reason=reason)
[ 667.161023] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 667.161023] nova-conductor[52331]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 667.161831] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-12821ee7-ee94-46f2-ba0f-5a60635ca489 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] [instance: 6d997056-e7cf-44c8-9087-f983c836251b] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 670.078772] nova-conductor[52332]: ERROR nova.conductor.manager [None req-271d0c1f-00ee-4e1f-bf2b-63ff5892e4d8 tempest-InstanceActionsTestJSON-1900562838 tempest-InstanceActionsTestJSON-1900562838-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 670.078772] nova-conductor[52332]: Traceback (most recent call last):
[ 670.078772] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 670.078772] nova-conductor[52332]: return func(*args, **kwargs)
[ 670.078772] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 670.078772] nova-conductor[52332]: selections = self._select_destinations(
[ 670.078772] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 670.078772] nova-conductor[52332]: selections = self._schedule(
[ 670.078772] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 670.078772] nova-conductor[52332]: self._ensure_sufficient_hosts(
[ 670.078772] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 670.078772] nova-conductor[52332]: raise exception.NoValidHost(reason=reason)
[ 670.078772] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 670.078772] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 670.078772] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 670.078772] nova-conductor[52332]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0],
[ 670.078772] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 670.078772] nova-conductor[52332]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations(
[ 670.078772] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 670.078772] nova-conductor[52332]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 670.078772] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 670.078772] nova-conductor[52332]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 670.078772] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 670.078772] nova-conductor[52332]: ERROR nova.conductor.manager result = self.transport._send(
[ 670.078772] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 670.078772] nova-conductor[52332]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message,
[ 670.078772] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 670.078772] nova-conductor[52332]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 670.078772] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 670.078772] nova-conductor[52332]: ERROR nova.conductor.manager raise result
[ 670.078772] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 670.078772] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 670.078772] nova-conductor[52332]: ERROR nova.conductor.manager
[ 670.078772] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 670.078772] nova-conductor[52332]: ERROR nova.conductor.manager return func(*args, **kwargs)
[ 670.078772] nova-conductor[52332]: ERROR nova.conductor.manager
[ 670.078772] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 670.078772] nova-conductor[52332]: ERROR nova.conductor.manager selections = self._select_destinations(
[ 670.078772] nova-conductor[52332]: ERROR nova.conductor.manager
[ 670.078772] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 670.078772] nova-conductor[52332]: ERROR nova.conductor.manager selections = self._schedule(
[ 670.078772] nova-conductor[52332]: ERROR nova.conductor.manager
[ 670.078772] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 670.078772] nova-conductor[52332]: ERROR nova.conductor.manager self._ensure_sufficient_hosts(
[ 670.078772] nova-conductor[52332]: ERROR nova.conductor.manager
[ 670.078772] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 670.078772] nova-conductor[52332]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason)
[ 670.078772] nova-conductor[52332]: ERROR nova.conductor.manager
[ 670.078772] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 670.078772] nova-conductor[52332]: ERROR nova.conductor.manager
[ 670.078772] nova-conductor[52332]: ERROR nova.conductor.manager
[ 670.085942] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-271d0c1f-00ee-4e1f-bf2b-63ff5892e4d8 tempest-InstanceActionsTestJSON-1900562838 tempest-InstanceActionsTestJSON-1900562838-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 670.086156] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-271d0c1f-00ee-4e1f-bf2b-63ff5892e4d8 tempest-InstanceActionsTestJSON-1900562838 tempest-InstanceActionsTestJSON-1900562838-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 670.086321] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-271d0c1f-00ee-4e1f-bf2b-63ff5892e4d8 tempest-InstanceActionsTestJSON-1900562838 tempest-InstanceActionsTestJSON-1900562838-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 670.126667] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-271d0c1f-00ee-4e1f-bf2b-63ff5892e4d8 tempest-InstanceActionsTestJSON-1900562838 tempest-InstanceActionsTestJSON-1900562838-project-member] [instance: 9f0d5df0-768f-4514-b25b-a909cd57e032] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 670.127393] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-271d0c1f-00ee-4e1f-bf2b-63ff5892e4d8 tempest-InstanceActionsTestJSON-1900562838 tempest-InstanceActionsTestJSON-1900562838-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 670.127601] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-271d0c1f-00ee-4e1f-bf2b-63ff5892e4d8 tempest-InstanceActionsTestJSON-1900562838 tempest-InstanceActionsTestJSON-1900562838-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 670.127768] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-271d0c1f-00ee-4e1f-bf2b-63ff5892e4d8 tempest-InstanceActionsTestJSON-1900562838 tempest-InstanceActionsTestJSON-1900562838-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 670.130583] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-271d0c1f-00ee-4e1f-bf2b-63ff5892e4d8 tempest-InstanceActionsTestJSON-1900562838 tempest-InstanceActionsTestJSON-1900562838-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 670.130583] nova-conductor[52332]: Traceback (most recent call last):
[ 670.130583] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 670.130583] nova-conductor[52332]: return func(*args, **kwargs)
[ 670.130583] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 670.130583] nova-conductor[52332]: selections = self._select_destinations(
[ 670.130583] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 670.130583] nova-conductor[52332]: selections = self._schedule(
[ 670.130583] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 670.130583] nova-conductor[52332]: self._ensure_sufficient_hosts(
[ 670.130583] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 670.130583] nova-conductor[52332]: raise exception.NoValidHost(reason=reason)
[ 670.130583] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 670.130583] nova-conductor[52332]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 670.131098] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-271d0c1f-00ee-4e1f-bf2b-63ff5892e4d8 tempest-InstanceActionsTestJSON-1900562838 tempest-InstanceActionsTestJSON-1900562838-project-member] [instance: 9f0d5df0-768f-4514-b25b-a909cd57e032] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 670.135258] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell0:INSERT=46,nova_cell0:SELECT=41,nova_cell0:UPDATE=11,nova_cell0:SAVEPOINT=1,nova_cell0:RELEASE=1 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 672.067002] nova-conductor[52331]: ERROR nova.conductor.manager [None req-4b0f0fc3-7379-4e64-b193-b08607ad2b7c tempest-ServerActionsV293TestJSON-1520812108 tempest-ServerActionsV293TestJSON-1520812108-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 672.067002] nova-conductor[52331]: Traceback (most recent call last):
[ 672.067002] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 672.067002] nova-conductor[52331]: return func(*args, **kwargs)
[ 672.067002] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 672.067002] nova-conductor[52331]: selections = self._select_destinations(
[ 672.067002] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 672.067002] nova-conductor[52331]: selections = self._schedule(
[ 672.067002] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 672.067002] nova-conductor[52331]: self._ensure_sufficient_hosts(
[ 672.067002] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 672.067002] nova-conductor[52331]: raise exception.NoValidHost(reason=reason)
[ 672.067002] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 672.067002] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last): [ 672.067002] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 672.067002] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 672.067002] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 672.067002] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 672.067002] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 672.067002] nova-conductor[52331]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 672.067002] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 672.067002] nova-conductor[52331]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 672.067002] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 672.067002] nova-conductor[52331]: ERROR nova.conductor.manager result = self.transport._send( [ 672.067002] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 672.067002] nova-conductor[52331]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 672.067002] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 672.067002] nova-conductor[52331]: ERROR nova.conductor.manager return 
self._send(target, ctxt, message, wait_for_reply, timeout, [ 672.067002] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 672.067002] nova-conductor[52331]: ERROR nova.conductor.manager raise result [ 672.067002] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 672.067002] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last): [ 672.067002] nova-conductor[52331]: ERROR nova.conductor.manager [ 672.067002] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 672.067002] nova-conductor[52331]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 672.067002] nova-conductor[52331]: ERROR nova.conductor.manager [ 672.067002] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 672.067002] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._select_destinations( [ 672.067002] nova-conductor[52331]: ERROR nova.conductor.manager [ 672.067002] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 672.067002] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._schedule( [ 672.067002] nova-conductor[52331]: ERROR nova.conductor.manager [ 672.067002] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 672.067002] nova-conductor[52331]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 672.067002] nova-conductor[52331]: ERROR nova.conductor.manager [ 672.067002] nova-conductor[52331]: ERROR nova.conductor.manager File 
"/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 672.067002] nova-conductor[52331]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 672.067002] nova-conductor[52331]: ERROR nova.conductor.manager [ 672.067002] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 672.067002] nova-conductor[52331]: ERROR nova.conductor.manager [ 672.067002] nova-conductor[52331]: ERROR nova.conductor.manager [ 672.074358] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-4b0f0fc3-7379-4e64-b193-b08607ad2b7c tempest-ServerActionsV293TestJSON-1520812108 tempest-ServerActionsV293TestJSON-1520812108-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 672.074575] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-4b0f0fc3-7379-4e64-b193-b08607ad2b7c tempest-ServerActionsV293TestJSON-1520812108 tempest-ServerActionsV293TestJSON-1520812108-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 672.075077] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-4b0f0fc3-7379-4e64-b193-b08607ad2b7c tempest-ServerActionsV293TestJSON-1520812108 tempest-ServerActionsV293TestJSON-1520812108-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 672.114132] nova-conductor[52331]: DEBUG dbcounter [-] [52331] 
Writing DB stats nova_cell0:INSERT=53,nova_cell0:SELECT=37,nova_cell0:UPDATE=8,nova_cell0:SAVEPOINT=1,nova_cell0:RELEASE=1 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 672.130611] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-4b0f0fc3-7379-4e64-b193-b08607ad2b7c tempest-ServerActionsV293TestJSON-1520812108 tempest-ServerActionsV293TestJSON-1520812108-project-member] [instance: 19606b3f-781a-4cb4-827b-050095c7aa0d] block_device_mapping [BlockDeviceMapping(attachment_id=314296c4-db9e-4467-8be0-48479696034b,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='volume',device_name=None,device_type=None,disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id=None,instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='volume',tag=None,updated_at=,uuid=,volume_id='750e2a3b-afc0-4e7d-bba8-75710836eefc',volume_size=1,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 672.131384] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-4b0f0fc3-7379-4e64-b193-b08607ad2b7c tempest-ServerActionsV293TestJSON-1520812108 tempest-ServerActionsV293TestJSON-1520812108-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 672.131598] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-4b0f0fc3-7379-4e64-b193-b08607ad2b7c tempest-ServerActionsV293TestJSON-1520812108 tempest-ServerActionsV293TestJSON-1520812108-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 672.131762] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-4b0f0fc3-7379-4e64-b193-b08607ad2b7c tempest-ServerActionsV293TestJSON-1520812108 tempest-ServerActionsV293TestJSON-1520812108-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 672.134905] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-4b0f0fc3-7379-4e64-b193-b08607ad2b7c tempest-ServerActionsV293TestJSON-1520812108 tempest-ServerActionsV293TestJSON-1520812108-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 672.134905] nova-conductor[52331]: Traceback (most recent call last): [ 672.134905] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 672.134905] nova-conductor[52331]: return func(*args, **kwargs) [ 672.134905] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 672.134905] nova-conductor[52331]: selections = self._select_destinations( [ 672.134905] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 672.134905] nova-conductor[52331]: selections = self._schedule( [ 672.134905] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 672.134905] nova-conductor[52331]: self._ensure_sufficient_hosts( [ 672.134905] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 672.134905] nova-conductor[52331]: raise exception.NoValidHost(reason=reason) [ 672.134905] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. 
There are not enough hosts available. [ 672.134905] nova-conductor[52331]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 672.135905] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-4b0f0fc3-7379-4e64-b193-b08607ad2b7c tempest-ServerActionsV293TestJSON-1520812108 tempest-ServerActionsV293TestJSON-1520812108-project-member] [instance: 19606b3f-781a-4cb4-827b-050095c7aa0d] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 673.649810] nova-conductor[52331]: ERROR nova.conductor.manager [None req-3d8101c7-b35c-4c68-9614-18d4398c7666 tempest-ServersTestBootFromVolume-600305305 tempest-ServersTestBootFromVolume-600305305-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 673.649810] nova-conductor[52331]: Traceback (most recent call last): [ 673.649810] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 673.649810] nova-conductor[52331]: return func(*args, **kwargs) [ 673.649810] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 673.649810] nova-conductor[52331]: selections = self._select_destinations( [ 673.649810] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 673.649810] nova-conductor[52331]: selections = self._schedule( [ 673.649810] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 673.649810] nova-conductor[52331]: self._ensure_sufficient_hosts( [ 673.649810] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 673.649810] nova-conductor[52331]: raise exception.NoValidHost(reason=reason) [ 673.649810] 
nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 673.649810] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last): [ 673.649810] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 673.649810] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 673.649810] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 673.649810] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 673.649810] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 673.649810] nova-conductor[52331]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 673.649810] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 673.649810] nova-conductor[52331]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 673.649810] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 673.649810] nova-conductor[52331]: ERROR nova.conductor.manager result = self.transport._send( [ 673.649810] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 673.649810] nova-conductor[52331]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 673.649810] nova-conductor[52331]: ERROR nova.conductor.manager File 
"/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 673.649810] nova-conductor[52331]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 673.649810] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 673.649810] nova-conductor[52331]: ERROR nova.conductor.manager raise result [ 673.649810] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 673.649810] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last): [ 673.649810] nova-conductor[52331]: ERROR nova.conductor.manager [ 673.649810] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 673.649810] nova-conductor[52331]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 673.649810] nova-conductor[52331]: ERROR nova.conductor.manager [ 673.649810] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 673.649810] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._select_destinations( [ 673.649810] nova-conductor[52331]: ERROR nova.conductor.manager [ 673.649810] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 673.649810] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._schedule( [ 673.649810] nova-conductor[52331]: ERROR nova.conductor.manager [ 673.649810] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 673.649810] nova-conductor[52331]: ERROR nova.conductor.manager 
self._ensure_sufficient_hosts( [ 673.649810] nova-conductor[52331]: ERROR nova.conductor.manager [ 673.649810] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 673.649810] nova-conductor[52331]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 673.649810] nova-conductor[52331]: ERROR nova.conductor.manager [ 673.649810] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 673.649810] nova-conductor[52331]: ERROR nova.conductor.manager [ 673.649810] nova-conductor[52331]: ERROR nova.conductor.manager [ 673.657208] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-3d8101c7-b35c-4c68-9614-18d4398c7666 tempest-ServersTestBootFromVolume-600305305 tempest-ServersTestBootFromVolume-600305305-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 673.657443] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-3d8101c7-b35c-4c68-9614-18d4398c7666 tempest-ServersTestBootFromVolume-600305305 tempest-ServersTestBootFromVolume-600305305-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 673.657636] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-3d8101c7-b35c-4c68-9614-18d4398c7666 tempest-ServersTestBootFromVolume-600305305 tempest-ServersTestBootFromVolume-600305305-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s 
{{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 673.672445] nova-conductor[52332]: ERROR nova.conductor.manager [None req-6e1d035b-621f-4973-8a25-3fad46c5623f tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 673.672445] nova-conductor[52332]: Traceback (most recent call last): [ 673.672445] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 673.672445] nova-conductor[52332]: return func(*args, **kwargs) [ 673.672445] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 673.672445] nova-conductor[52332]: selections = self._select_destinations( [ 673.672445] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 673.672445] nova-conductor[52332]: selections = self._schedule( [ 673.672445] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 673.672445] nova-conductor[52332]: self._ensure_sufficient_hosts( [ 673.672445] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 673.672445] nova-conductor[52332]: raise exception.NoValidHost(reason=reason) [ 673.672445] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 673.672445] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last): [ 673.672445] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 673.672445] nova-conductor[52332]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 673.672445] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 673.672445] nova-conductor[52332]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 673.672445] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 673.672445] nova-conductor[52332]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 673.672445] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 673.672445] nova-conductor[52332]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 673.672445] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 673.672445] nova-conductor[52332]: ERROR nova.conductor.manager result = self.transport._send( [ 673.672445] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 673.672445] nova-conductor[52332]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 673.672445] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 673.672445] nova-conductor[52332]: ERROR nova.conductor.manager return 
self._send(target, ctxt, message, wait_for_reply, timeout, [ 673.672445] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 673.672445] nova-conductor[52332]: ERROR nova.conductor.manager raise result [ 673.672445] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 673.672445] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last): [ 673.672445] nova-conductor[52332]: ERROR nova.conductor.manager [ 673.672445] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 673.672445] nova-conductor[52332]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 673.672445] nova-conductor[52332]: ERROR nova.conductor.manager [ 673.672445] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 673.672445] nova-conductor[52332]: ERROR nova.conductor.manager selections = self._select_destinations( [ 673.672445] nova-conductor[52332]: ERROR nova.conductor.manager [ 673.672445] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 673.672445] nova-conductor[52332]: ERROR nova.conductor.manager selections = self._schedule( [ 673.672445] nova-conductor[52332]: ERROR nova.conductor.manager [ 673.672445] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 673.672445] nova-conductor[52332]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 673.672445] nova-conductor[52332]: ERROR nova.conductor.manager [ 673.672445] nova-conductor[52332]: ERROR nova.conductor.manager File 
"/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 673.672445] nova-conductor[52332]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 673.672445] nova-conductor[52332]: ERROR nova.conductor.manager [ 673.672445] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 673.672445] nova-conductor[52332]: ERROR nova.conductor.manager [ 673.672445] nova-conductor[52332]: ERROR nova.conductor.manager [ 673.679676] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-6e1d035b-621f-4973-8a25-3fad46c5623f tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 673.679971] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-6e1d035b-621f-4973-8a25-3fad46c5623f tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 673.680368] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-6e1d035b-621f-4973-8a25-3fad46c5623f tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 673.713619] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-3d8101c7-b35c-4c68-9614-18d4398c7666 
tempest-ServersTestBootFromVolume-600305305 tempest-ServersTestBootFromVolume-600305305-project-member] [instance: cfd70a44-1ab0-4cbd-9b0a-815976ef5337] block_device_mapping [BlockDeviceMapping(attachment_id=4239bedc-ef30-4270-a01c-0592d4d36751,boot_index=0,connection_info=None,created_at=<?>,delete_on_termination=True,deleted=<?>,deleted_at=<?>,destination_type='volume',device_name=None,device_type=None,disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=<?>,image_id=None,instance=<?>,instance_uuid=<?>,no_device=False,snapshot_id=None,source_type='volume',tag=None,updated_at=<?>,uuid=<?>,volume_id='2bee561f-a801-411e-8bb6-faef4afddb03',volume_size=1,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 673.714377] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-3d8101c7-b35c-4c68-9614-18d4398c7666 tempest-ServersTestBootFromVolume-600305305 tempest-ServersTestBootFromVolume-600305305-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 673.714579] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-3d8101c7-b35c-4c68-9614-18d4398c7666 tempest-ServersTestBootFromVolume-600305305 tempest-ServersTestBootFromVolume-600305305-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 673.714745] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-3d8101c7-b35c-4c68-9614-18d4398c7666 tempest-ServersTestBootFromVolume-600305305 tempest-ServersTestBootFromVolume-600305305-project-member] Lock "00000000-0000-0000-0000-000000000000" 
"released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 673.723596] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-3d8101c7-b35c-4c68-9614-18d4398c7666 tempest-ServersTestBootFromVolume-600305305 tempest-ServersTestBootFromVolume-600305305-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 673.723596] nova-conductor[52331]: Traceback (most recent call last): [ 673.723596] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 673.723596] nova-conductor[52331]: return func(*args, **kwargs) [ 673.723596] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 673.723596] nova-conductor[52331]: selections = self._select_destinations( [ 673.723596] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 673.723596] nova-conductor[52331]: selections = self._schedule( [ 673.723596] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 673.723596] nova-conductor[52331]: self._ensure_sufficient_hosts( [ 673.723596] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 673.723596] nova-conductor[52331]: raise exception.NoValidHost(reason=reason) [ 673.723596] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 673.723596] nova-conductor[52331]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 673.724041] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-3d8101c7-b35c-4c68-9614-18d4398c7666 tempest-ServersTestBootFromVolume-600305305 tempest-ServersTestBootFromVolume-600305305-project-member] [instance: cfd70a44-1ab0-4cbd-9b0a-815976ef5337] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 673.724307] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-6e1d035b-621f-4973-8a25-3fad46c5623f tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] [instance: e2c8f0ba-5a6f-4559-89c4-f442c235de59] block_device_mapping [BlockDeviceMapping(attachment_id=<?>,boot_index=0,connection_info=None,created_at=<?>,delete_on_termination=True,deleted=<?>,deleted_at=<?>,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=<?>,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=<?>,instance_uuid=<?>,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=<?>,uuid=<?>,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 673.724978] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-6e1d035b-621f-4973-8a25-3fad46c5623f tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 673.725218] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-6e1d035b-621f-4973-8a25-3fad46c5623f tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by 
"nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 673.725391] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-6e1d035b-621f-4973-8a25-3fad46c5623f tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 673.728866] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-6e1d035b-621f-4973-8a25-3fad46c5623f tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 673.728866] nova-conductor[52332]: Traceback (most recent call last): [ 673.728866] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 673.728866] nova-conductor[52332]: return func(*args, **kwargs) [ 673.728866] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 673.728866] nova-conductor[52332]: selections = self._select_destinations( [ 673.728866] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 673.728866] nova-conductor[52332]: selections = self._schedule( [ 673.728866] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 673.728866] nova-conductor[52332]: self._ensure_sufficient_hosts( [ 673.728866] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 673.728866] nova-conductor[52332]: raise exception.NoValidHost(reason=reason) [ 673.728866] 
nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 673.728866] nova-conductor[52332]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 673.729411] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-6e1d035b-621f-4973-8a25-3fad46c5623f tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] [instance: e2c8f0ba-5a6f-4559-89c4-f442c235de59] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 674.295600] nova-conductor[52331]: ERROR nova.conductor.manager [None req-2afe938b-24a1-4b32-a720-13b5f63fb71e tempest-ServerShowV247Test-1484717353 tempest-ServerShowV247Test-1484717353-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 674.295600] nova-conductor[52331]: Traceback (most recent call last): [ 674.295600] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 674.295600] nova-conductor[52331]: return func(*args, **kwargs) [ 674.295600] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 674.295600] nova-conductor[52331]: selections = self._select_destinations( [ 674.295600] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 674.295600] nova-conductor[52331]: selections = self._schedule( [ 674.295600] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 674.295600] nova-conductor[52331]: self._ensure_sufficient_hosts( [ 674.295600] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 674.295600] nova-conductor[52331]: raise 
exception.NoValidHost(reason=reason) [ 674.295600] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 674.295600] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last): [ 674.295600] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 674.295600] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 674.295600] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 674.295600] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 674.295600] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 674.295600] nova-conductor[52331]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 674.295600] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 674.295600] nova-conductor[52331]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 674.295600] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 674.295600] nova-conductor[52331]: ERROR nova.conductor.manager result = self.transport._send( [ 674.295600] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 674.295600] nova-conductor[52331]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 674.295600] nova-conductor[52331]: ERROR nova.conductor.manager File 
"/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 674.295600] nova-conductor[52331]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 674.295600] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 674.295600] nova-conductor[52331]: ERROR nova.conductor.manager raise result [ 674.295600] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 674.295600] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last): [ 674.295600] nova-conductor[52331]: ERROR nova.conductor.manager [ 674.295600] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 674.295600] nova-conductor[52331]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 674.295600] nova-conductor[52331]: ERROR nova.conductor.manager [ 674.295600] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 674.295600] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._select_destinations( [ 674.295600] nova-conductor[52331]: ERROR nova.conductor.manager [ 674.295600] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 674.295600] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._schedule( [ 674.295600] nova-conductor[52331]: ERROR nova.conductor.manager [ 674.295600] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 674.295600] nova-conductor[52331]: ERROR nova.conductor.manager 
self._ensure_sufficient_hosts( [ 674.295600] nova-conductor[52331]: ERROR nova.conductor.manager [ 674.295600] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 674.295600] nova-conductor[52331]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 674.295600] nova-conductor[52331]: ERROR nova.conductor.manager [ 674.295600] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 674.295600] nova-conductor[52331]: ERROR nova.conductor.manager [ 674.295600] nova-conductor[52331]: ERROR nova.conductor.manager [ 674.304191] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-2afe938b-24a1-4b32-a720-13b5f63fb71e tempest-ServerShowV247Test-1484717353 tempest-ServerShowV247Test-1484717353-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 674.304423] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-2afe938b-24a1-4b32-a720-13b5f63fb71e tempest-ServerShowV247Test-1484717353 tempest-ServerShowV247Test-1484717353-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 674.304594] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-2afe938b-24a1-4b32-a720-13b5f63fb71e tempest-ServerShowV247Test-1484717353 tempest-ServerShowV247Test-1484717353-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 674.368100] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-2afe938b-24a1-4b32-a720-13b5f63fb71e tempest-ServerShowV247Test-1484717353 tempest-ServerShowV247Test-1484717353-project-member] [instance: 0681aa0f-98cc-4e1a-95d8-709a4f2d57af] block_device_mapping [BlockDeviceMapping(attachment_id=<?>,boot_index=0,connection_info=None,created_at=<?>,delete_on_termination=True,deleted=<?>,deleted_at=<?>,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=<?>,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=<?>,instance_uuid=<?>,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=<?>,uuid=<?>,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 674.368825] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-2afe938b-24a1-4b32-a720-13b5f63fb71e tempest-ServerShowV247Test-1484717353 tempest-ServerShowV247Test-1484717353-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 674.369032] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-2afe938b-24a1-4b32-a720-13b5f63fb71e tempest-ServerShowV247Test-1484717353 tempest-ServerShowV247Test-1484717353-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 674.369198] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-2afe938b-24a1-4b32-a720-13b5f63fb71e 
tempest-ServerShowV247Test-1484717353 tempest-ServerShowV247Test-1484717353-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 674.371649] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell0:INSERT=45,nova_cell0:SELECT=40,nova_cell0:UPDATE=11,nova_cell0:SAVEPOINT=2,nova_cell0:RELEASE=2 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 674.374402] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-2afe938b-24a1-4b32-a720-13b5f63fb71e tempest-ServerShowV247Test-1484717353 tempest-ServerShowV247Test-1484717353-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 674.374402] nova-conductor[52331]: Traceback (most recent call last): [ 674.374402] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 674.374402] nova-conductor[52331]: return func(*args, **kwargs) [ 674.374402] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 674.374402] nova-conductor[52331]: selections = self._select_destinations( [ 674.374402] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 674.374402] nova-conductor[52331]: selections = self._schedule( [ 674.374402] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 674.374402] nova-conductor[52331]: self._ensure_sufficient_hosts( [ 674.374402] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 674.374402] nova-conductor[52331]: raise exception.NoValidHost(reason=reason) [ 674.374402] nova-conductor[52331]: 
nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 674.374402] nova-conductor[52331]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 674.374972] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-2afe938b-24a1-4b32-a720-13b5f63fb71e tempest-ServerShowV247Test-1484717353 tempest-ServerShowV247Test-1484717353-project-member] [instance: 0681aa0f-98cc-4e1a-95d8-709a4f2d57af] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 675.680252] nova-conductor[52332]: ERROR nova.conductor.manager [None req-a226e934-a46e-4de0-b668-6789617eea4b tempest-ServerMetadataTestJSON-2041824842 tempest-ServerMetadataTestJSON-2041824842-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 675.680252] nova-conductor[52332]: Traceback (most recent call last):
[ 675.680252] nova-conductor[52332]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 675.680252] nova-conductor[52332]:     return func(*args, **kwargs)
[ 675.680252] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 675.680252] nova-conductor[52332]:     selections = self._select_destinations(
[ 675.680252] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 675.680252] nova-conductor[52332]:     selections = self._schedule(
[ 675.680252] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 675.680252] nova-conductor[52332]:     self._ensure_sufficient_hosts(
[ 675.680252] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 675.680252] nova-conductor[52332]:     raise exception.NoValidHost(reason=reason)
[ 675.680252] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 675.680252] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 675.680252] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 675.680252] nova-conductor[52332]: ERROR nova.conductor.manager     host_lists = self._schedule_instances(context, request_specs[0],
[ 675.680252] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 675.680252] nova-conductor[52332]: ERROR nova.conductor.manager     host_lists = self.query_client.select_destinations(
[ 675.680252] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 675.680252] nova-conductor[52332]: ERROR nova.conductor.manager     return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 675.680252] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 675.680252] nova-conductor[52332]: ERROR nova.conductor.manager     return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 675.680252] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 675.680252] nova-conductor[52332]: ERROR nova.conductor.manager     result = self.transport._send(
[ 675.680252] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 675.680252] nova-conductor[52332]: ERROR nova.conductor.manager     return self._driver.send(target, ctxt, message,
[ 675.680252] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 675.680252] nova-conductor[52332]: ERROR nova.conductor.manager     return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 675.680252] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 675.680252] nova-conductor[52332]: ERROR nova.conductor.manager     raise result
[ 675.680252] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 675.680252] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 675.680252] nova-conductor[52332]: ERROR nova.conductor.manager
[ 675.680252] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 675.680252] nova-conductor[52332]: ERROR nova.conductor.manager     return func(*args, **kwargs)
[ 675.680252] nova-conductor[52332]: ERROR nova.conductor.manager
[ 675.680252] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 675.680252] nova-conductor[52332]: ERROR nova.conductor.manager     selections = self._select_destinations(
[ 675.680252] nova-conductor[52332]: ERROR nova.conductor.manager
[ 675.680252] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 675.680252] nova-conductor[52332]: ERROR nova.conductor.manager     selections = self._schedule(
[ 675.680252] nova-conductor[52332]: ERROR nova.conductor.manager
[ 675.680252] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 675.680252] nova-conductor[52332]: ERROR nova.conductor.manager     self._ensure_sufficient_hosts(
[ 675.680252] nova-conductor[52332]: ERROR nova.conductor.manager
[ 675.680252] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 675.680252] nova-conductor[52332]: ERROR nova.conductor.manager     raise exception.NoValidHost(reason=reason)
[ 675.680252] nova-conductor[52332]: ERROR nova.conductor.manager
[ 675.680252] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 675.680252] nova-conductor[52332]: ERROR nova.conductor.manager
[ 675.680252] nova-conductor[52332]: ERROR nova.conductor.manager
[ 675.686649] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-a226e934-a46e-4de0-b668-6789617eea4b tempest-ServerMetadataTestJSON-2041824842 tempest-ServerMetadataTestJSON-2041824842-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 675.686858] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-a226e934-a46e-4de0-b668-6789617eea4b tempest-ServerMetadataTestJSON-2041824842 tempest-ServerMetadataTestJSON-2041824842-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 675.687106] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-a226e934-a46e-4de0-b668-6789617eea4b tempest-ServerMetadataTestJSON-2041824842 tempest-ServerMetadataTestJSON-2041824842-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 675.741034] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-a226e934-a46e-4de0-b668-6789617eea4b tempest-ServerMetadataTestJSON-2041824842 tempest-ServerMetadataTestJSON-2041824842-project-member] [instance: 35148a4d-7f4d-46db-ba5f-3690be4656a9] block_device_mapping [BlockDeviceMapping(attachment_id=<?>,boot_index=0,connection_info=None,created_at=<?>,delete_on_termination=True,deleted=<?>,deleted_at=<?>,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=<?>,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=<?>,instance_uuid=<?>,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=<?>,uuid=<?>,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 675.741780] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-a226e934-a46e-4de0-b668-6789617eea4b tempest-ServerMetadataTestJSON-2041824842 tempest-ServerMetadataTestJSON-2041824842-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 675.742023] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-a226e934-a46e-4de0-b668-6789617eea4b tempest-ServerMetadataTestJSON-2041824842 tempest-ServerMetadataTestJSON-2041824842-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 675.742368] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-a226e934-a46e-4de0-b668-6789617eea4b tempest-ServerMetadataTestJSON-2041824842 tempest-ServerMetadataTestJSON-2041824842-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 675.747535] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-a226e934-a46e-4de0-b668-6789617eea4b tempest-ServerMetadataTestJSON-2041824842 tempest-ServerMetadataTestJSON-2041824842-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 675.747535] nova-conductor[52332]: Traceback (most recent call last):
[ 675.747535] nova-conductor[52332]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 675.747535] nova-conductor[52332]:     return func(*args, **kwargs)
[ 675.747535] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 675.747535] nova-conductor[52332]:     selections = self._select_destinations(
[ 675.747535] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 675.747535] nova-conductor[52332]:     selections = self._schedule(
[ 675.747535] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 675.747535] nova-conductor[52332]:     self._ensure_sufficient_hosts(
[ 675.747535] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 675.747535] nova-conductor[52332]:     raise exception.NoValidHost(reason=reason)
[ 675.747535] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 675.747535] nova-conductor[52332]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 675.748104] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-a226e934-a46e-4de0-b668-6789617eea4b tempest-ServerMetadataTestJSON-2041824842 tempest-ServerMetadataTestJSON-2041824842-project-member] [instance: 35148a4d-7f4d-46db-ba5f-3690be4656a9] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 676.862636] nova-conductor[52331]: ERROR nova.conductor.manager [None req-b218a13f-2ed5-4c0f-ab5d-154c9fbf89ff tempest-ServerAddressesNegativeTestJSON-653593757 tempest-ServerAddressesNegativeTestJSON-653593757-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 676.862636] nova-conductor[52331]: Traceback (most recent call last):
[ 676.862636] nova-conductor[52331]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 676.862636] nova-conductor[52331]:     return func(*args, **kwargs)
[ 676.862636] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 676.862636] nova-conductor[52331]:     selections = self._select_destinations(
[ 676.862636] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 676.862636] nova-conductor[52331]:     selections = self._schedule(
[ 676.862636] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 676.862636] nova-conductor[52331]:     self._ensure_sufficient_hosts(
[ 676.862636] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 676.862636] nova-conductor[52331]:     raise exception.NoValidHost(reason=reason)
[ 676.862636] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 676.862636] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 676.862636] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 676.862636] nova-conductor[52331]: ERROR nova.conductor.manager     host_lists = self._schedule_instances(context, request_specs[0],
[ 676.862636] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 676.862636] nova-conductor[52331]: ERROR nova.conductor.manager     host_lists = self.query_client.select_destinations(
[ 676.862636] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 676.862636] nova-conductor[52331]: ERROR nova.conductor.manager     return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 676.862636] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 676.862636] nova-conductor[52331]: ERROR nova.conductor.manager     return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 676.862636] nova-conductor[52331]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 676.862636] nova-conductor[52331]: ERROR nova.conductor.manager     result = self.transport._send(
[ 676.862636] nova-conductor[52331]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 676.862636] nova-conductor[52331]: ERROR nova.conductor.manager     return self._driver.send(target, ctxt, message,
[ 676.862636] nova-conductor[52331]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 676.862636] nova-conductor[52331]: ERROR nova.conductor.manager     return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 676.862636] nova-conductor[52331]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 676.862636] nova-conductor[52331]: ERROR nova.conductor.manager     raise result
[ 676.862636] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 676.862636] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 676.862636] nova-conductor[52331]: ERROR nova.conductor.manager
[ 676.862636] nova-conductor[52331]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 676.862636] nova-conductor[52331]: ERROR nova.conductor.manager     return func(*args, **kwargs)
[ 676.862636] nova-conductor[52331]: ERROR nova.conductor.manager
[ 676.862636] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 676.862636] nova-conductor[52331]: ERROR nova.conductor.manager     selections = self._select_destinations(
[ 676.862636] nova-conductor[52331]: ERROR nova.conductor.manager
[ 676.862636] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 676.862636] nova-conductor[52331]: ERROR nova.conductor.manager     selections = self._schedule(
[ 676.862636] nova-conductor[52331]: ERROR nova.conductor.manager
[ 676.862636] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 676.862636] nova-conductor[52331]: ERROR nova.conductor.manager     self._ensure_sufficient_hosts(
[ 676.862636] nova-conductor[52331]: ERROR nova.conductor.manager
[ 676.862636] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 676.862636] nova-conductor[52331]: ERROR nova.conductor.manager     raise exception.NoValidHost(reason=reason)
[ 676.862636] nova-conductor[52331]: ERROR nova.conductor.manager
[ 676.862636] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 676.862636] nova-conductor[52331]: ERROR nova.conductor.manager
[ 676.862636] nova-conductor[52331]: ERROR nova.conductor.manager
[ 676.870484] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-b218a13f-2ed5-4c0f-ab5d-154c9fbf89ff tempest-ServerAddressesNegativeTestJSON-653593757 tempest-ServerAddressesNegativeTestJSON-653593757-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 676.870747] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-b218a13f-2ed5-4c0f-ab5d-154c9fbf89ff tempest-ServerAddressesNegativeTestJSON-653593757 tempest-ServerAddressesNegativeTestJSON-653593757-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 676.870993] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-b218a13f-2ed5-4c0f-ab5d-154c9fbf89ff tempest-ServerAddressesNegativeTestJSON-653593757 tempest-ServerAddressesNegativeTestJSON-653593757-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 676.938567] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-b218a13f-2ed5-4c0f-ab5d-154c9fbf89ff tempest-ServerAddressesNegativeTestJSON-653593757 tempest-ServerAddressesNegativeTestJSON-653593757-project-member] [instance: e24759aa-ebe7-4aac-8261-a01deac185f7] block_device_mapping [BlockDeviceMapping(attachment_id=<?>,boot_index=0,connection_info=None,created_at=<?>,delete_on_termination=True,deleted=<?>,deleted_at=<?>,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=<?>,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=<?>,instance_uuid=<?>,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=<?>,uuid=<?>,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 676.939191] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-b218a13f-2ed5-4c0f-ab5d-154c9fbf89ff tempest-ServerAddressesNegativeTestJSON-653593757 tempest-ServerAddressesNegativeTestJSON-653593757-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 676.939396] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-b218a13f-2ed5-4c0f-ab5d-154c9fbf89ff tempest-ServerAddressesNegativeTestJSON-653593757 tempest-ServerAddressesNegativeTestJSON-653593757-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 676.939569] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-b218a13f-2ed5-4c0f-ab5d-154c9fbf89ff tempest-ServerAddressesNegativeTestJSON-653593757 tempest-ServerAddressesNegativeTestJSON-653593757-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 676.942967] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-b218a13f-2ed5-4c0f-ab5d-154c9fbf89ff tempest-ServerAddressesNegativeTestJSON-653593757 tempest-ServerAddressesNegativeTestJSON-653593757-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 676.942967] nova-conductor[52331]: Traceback (most recent call last):
[ 676.942967] nova-conductor[52331]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 676.942967] nova-conductor[52331]:     return func(*args, **kwargs)
[ 676.942967] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 676.942967] nova-conductor[52331]:     selections = self._select_destinations(
[ 676.942967] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 676.942967] nova-conductor[52331]:     selections = self._schedule(
[ 676.942967] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 676.942967] nova-conductor[52331]:     self._ensure_sufficient_hosts(
[ 676.942967] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 676.942967] nova-conductor[52331]:     raise exception.NoValidHost(reason=reason)
[ 676.942967] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 676.942967] nova-conductor[52331]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 676.943488] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-b218a13f-2ed5-4c0f-ab5d-154c9fbf89ff tempest-ServerAddressesNegativeTestJSON-653593757 tempest-ServerAddressesNegativeTestJSON-653593757-project-member] [instance: e24759aa-ebe7-4aac-8261-a01deac185f7] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 677.439630] nova-conductor[52332]: ERROR nova.conductor.manager [None req-f17dec79-b98d-4b38-81ce-ef356731a7d0 tempest-ServerShowV247Test-1484717353 tempest-ServerShowV247Test-1484717353-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 677.439630] nova-conductor[52332]: Traceback (most recent call last):
[ 677.439630] nova-conductor[52332]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 677.439630] nova-conductor[52332]:     return func(*args, **kwargs)
[ 677.439630] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 677.439630] nova-conductor[52332]:     selections = self._select_destinations(
[ 677.439630] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 677.439630] nova-conductor[52332]:     selections = self._schedule(
[ 677.439630] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 677.439630] nova-conductor[52332]:     self._ensure_sufficient_hosts(
[ 677.439630] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 677.439630] nova-conductor[52332]:     raise exception.NoValidHost(reason=reason)
[ 677.439630] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 677.439630] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 677.439630] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 677.439630] nova-conductor[52332]: ERROR nova.conductor.manager     host_lists = self._schedule_instances(context, request_specs[0],
[ 677.439630] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 677.439630] nova-conductor[52332]: ERROR nova.conductor.manager     host_lists = self.query_client.select_destinations(
[ 677.439630] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 677.439630] nova-conductor[52332]: ERROR nova.conductor.manager     return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 677.439630] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 677.439630] nova-conductor[52332]: ERROR nova.conductor.manager     return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 677.439630] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 677.439630] nova-conductor[52332]: ERROR nova.conductor.manager     result = self.transport._send(
[ 677.439630] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 677.439630] nova-conductor[52332]: ERROR nova.conductor.manager     return self._driver.send(target, ctxt, message,
[ 677.439630] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 677.439630] nova-conductor[52332]: ERROR nova.conductor.manager     return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 677.439630] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 677.439630] nova-conductor[52332]: ERROR nova.conductor.manager     raise result
[ 677.439630] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 677.439630] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 677.439630] nova-conductor[52332]: ERROR nova.conductor.manager
[ 677.439630] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 677.439630] nova-conductor[52332]: ERROR nova.conductor.manager     return func(*args, **kwargs)
[ 677.439630] nova-conductor[52332]: ERROR nova.conductor.manager
[ 677.439630] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 677.439630] nova-conductor[52332]: ERROR nova.conductor.manager     selections = self._select_destinations(
[ 677.439630] nova-conductor[52332]: ERROR nova.conductor.manager
[ 677.439630] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 677.439630] nova-conductor[52332]: ERROR nova.conductor.manager     selections = self._schedule(
[ 677.439630] nova-conductor[52332]: ERROR nova.conductor.manager
[ 677.439630] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 677.439630] nova-conductor[52332]: ERROR nova.conductor.manager     self._ensure_sufficient_hosts(
[ 677.439630] nova-conductor[52332]: ERROR nova.conductor.manager
[ 677.439630] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 677.439630] nova-conductor[52332]: ERROR nova.conductor.manager     raise exception.NoValidHost(reason=reason)
[ 677.439630] nova-conductor[52332]: ERROR nova.conductor.manager
[ 677.439630] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 677.439630] nova-conductor[52332]: ERROR nova.conductor.manager
[ 677.439630] nova-conductor[52332]: ERROR nova.conductor.manager
[ 677.446334] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-f17dec79-b98d-4b38-81ce-ef356731a7d0 tempest-ServerShowV247Test-1484717353 tempest-ServerShowV247Test-1484717353-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 677.446613] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-f17dec79-b98d-4b38-81ce-ef356731a7d0 tempest-ServerShowV247Test-1484717353 tempest-ServerShowV247Test-1484717353-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 677.446830] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-f17dec79-b98d-4b38-81ce-ef356731a7d0 tempest-ServerShowV247Test-1484717353 tempest-ServerShowV247Test-1484717353-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 677.459107] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell0:SELECT=41,nova_cell0:UPDATE=9,nova_cell0:INSERT=48,nova_cell0:SAVEPOINT=1,nova_cell0:RELEASE=1 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 677.498364] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-f17dec79-b98d-4b38-81ce-ef356731a7d0 tempest-ServerShowV247Test-1484717353 tempest-ServerShowV247Test-1484717353-project-member] [instance: c69c77b7-9b2a-46b9-9743-ee012b2c0350] block_device_mapping [BlockDeviceMapping(attachment_id=<?>,boot_index=0,connection_info=None,created_at=<?>,delete_on_termination=True,deleted=<?>,deleted_at=<?>,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=<?>,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=<?>,instance_uuid=<?>,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=<?>,uuid=<?>,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 677.499160] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-f17dec79-b98d-4b38-81ce-ef356731a7d0 tempest-ServerShowV247Test-1484717353 tempest-ServerShowV247Test-1484717353-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 677.499375] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-f17dec79-b98d-4b38-81ce-ef356731a7d0 tempest-ServerShowV247Test-1484717353 tempest-ServerShowV247Test-1484717353-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 677.499541] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-f17dec79-b98d-4b38-81ce-ef356731a7d0 tempest-ServerShowV247Test-1484717353 tempest-ServerShowV247Test-1484717353-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 677.502741] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-f17dec79-b98d-4b38-81ce-ef356731a7d0 tempest-ServerShowV247Test-1484717353 tempest-ServerShowV247Test-1484717353-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 677.502741] nova-conductor[52332]: Traceback (most recent call last):
[ 677.502741] nova-conductor[52332]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 677.502741] nova-conductor[52332]:     return func(*args, **kwargs)
[ 677.502741] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 677.502741] nova-conductor[52332]:     selections = self._select_destinations(
[ 677.502741] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 677.502741] nova-conductor[52332]:     selections = self._schedule(
[ 677.502741] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 677.502741] nova-conductor[52332]:     self._ensure_sufficient_hosts(
[ 677.502741] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 677.502741] nova-conductor[52332]:     raise exception.NoValidHost(reason=reason)
[ 677.502741] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 677.502741] nova-conductor[52332]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 677.503376] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-f17dec79-b98d-4b38-81ce-ef356731a7d0 tempest-ServerShowV247Test-1484717353 tempest-ServerShowV247Test-1484717353-project-member] [instance: c69c77b7-9b2a-46b9-9743-ee012b2c0350] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 680.631075] nova-conductor[52331]: ERROR nova.conductor.manager [None req-00ead4c0-b931-4145-9f55-adc2a061bf12 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 680.631075] nova-conductor[52331]: Traceback (most recent call last):
[ 680.631075] nova-conductor[52331]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 680.631075] nova-conductor[52331]:     return func(*args, **kwargs)
[ 680.631075] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 680.631075] nova-conductor[52331]:     selections = self._select_destinations(
[ 680.631075] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 680.631075] nova-conductor[52331]:     selections = self._schedule(
[ 680.631075] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 680.631075] nova-conductor[52331]:     self._ensure_sufficient_hosts(
[ 680.631075] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 680.631075] nova-conductor[52331]:     raise exception.NoValidHost(reason=reason)
[ 680.631075] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 680.631075] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 680.631075] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 680.631075] nova-conductor[52331]: ERROR nova.conductor.manager     host_lists = self._schedule_instances(context, request_specs[0],
[ 680.631075] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 680.631075] nova-conductor[52331]: ERROR nova.conductor.manager     host_lists = self.query_client.select_destinations(
[ 680.631075] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 680.631075] nova-conductor[52331]: ERROR nova.conductor.manager     return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 680.631075] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 680.631075] nova-conductor[52331]: ERROR nova.conductor.manager     return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 680.631075] nova-conductor[52331]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 680.631075] nova-conductor[52331]: ERROR nova.conductor.manager     result = self.transport._send(
[ 680.631075] nova-conductor[52331]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 680.631075] nova-conductor[52331]: ERROR nova.conductor.manager     return self._driver.send(target, ctxt, message,
[ 680.631075] nova-conductor[52331]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 680.631075] nova-conductor[52331]: ERROR nova.conductor.manager     return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 680.631075] nova-conductor[52331]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 680.631075] nova-conductor[52331]: ERROR nova.conductor.manager     raise result
[ 680.631075] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 680.631075] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 680.631075] nova-conductor[52331]: ERROR nova.conductor.manager
[ 680.631075] nova-conductor[52331]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 680.631075] nova-conductor[52331]: ERROR nova.conductor.manager     return func(*args, **kwargs)
[ 680.631075] nova-conductor[52331]: ERROR nova.conductor.manager
[ 680.631075] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 680.631075] nova-conductor[52331]: ERROR nova.conductor.manager     selections = self._select_destinations(
[ 680.631075] nova-conductor[52331]: ERROR nova.conductor.manager
[ 680.631075] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 680.631075] nova-conductor[52331]: ERROR nova.conductor.manager     selections = self._schedule(
[ 680.631075] nova-conductor[52331]: ERROR nova.conductor.manager
[ 680.631075] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 680.631075] nova-conductor[52331]: ERROR nova.conductor.manager     self._ensure_sufficient_hosts(
[ 680.631075] nova-conductor[52331]: ERROR nova.conductor.manager
[ 680.631075] nova-conductor[52331]:
ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 680.631075] nova-conductor[52331]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 680.631075] nova-conductor[52331]: ERROR nova.conductor.manager [ 680.631075] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 680.631075] nova-conductor[52331]: ERROR nova.conductor.manager [ 680.631075] nova-conductor[52331]: ERROR nova.conductor.manager [ 680.637569] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-00ead4c0-b931-4145-9f55-adc2a061bf12 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 680.638176] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-00ead4c0-b931-4145-9f55-adc2a061bf12 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 680.638176] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-00ead4c0-b931-4145-9f55-adc2a061bf12 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 680.696203] nova-conductor[52331]: DEBUG 
nova.conductor.manager [None req-00ead4c0-b931-4145-9f55-adc2a061bf12 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] [instance: f8535eca-047b-4107-951d-45ba83e59d1b] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 680.696948] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-00ead4c0-b931-4145-9f55-adc2a061bf12 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 680.697305] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-00ead4c0-b931-4145-9f55-adc2a061bf12 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 680.697553] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-00ead4c0-b931-4145-9f55-adc2a061bf12 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 680.705446] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-00ead4c0-b931-4145-9f55-adc2a061bf12 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 680.705446] nova-conductor[52331]: Traceback (most recent call last): [ 680.705446] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 680.705446] nova-conductor[52331]: return func(*args, **kwargs) [ 680.705446] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 680.705446] nova-conductor[52331]: selections = self._select_destinations( [ 680.705446] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 680.705446] nova-conductor[52331]: selections = self._schedule( [ 680.705446] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 680.705446] nova-conductor[52331]: self._ensure_sufficient_hosts( [ 680.705446] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 680.705446] nova-conductor[52331]: raise exception.NoValidHost(reason=reason) [ 680.705446] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 680.705446] nova-conductor[52331]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 680.705446] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-00ead4c0-b931-4145-9f55-adc2a061bf12 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] [instance: f8535eca-047b-4107-951d-45ba83e59d1b] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 682.183381] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-d1c90e8a-7bcd-4de3-bced-2cd71ade6f73 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Took 0.19 seconds to select destinations for 1 instance(s). {{(pid=52331) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 682.195758] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d1c90e8a-7bcd-4de3-bced-2cd71ade6f73 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 682.195982] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d1c90e8a-7bcd-4de3-bced-2cd71ade6f73 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 682.196178] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d1c90e8a-7bcd-4de3-bced-2cd71ade6f73 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 682.235315] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d1c90e8a-7bcd-4de3-bced-2cd71ade6f73 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 682.235597] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d1c90e8a-7bcd-4de3-bced-2cd71ade6f73 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 682.235811] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d1c90e8a-7bcd-4de3-bced-2cd71ade6f73 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 682.236248] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d1c90e8a-7bcd-4de3-bced-2cd71ade6f73 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 682.236516] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d1c90e8a-7bcd-4de3-bced-2cd71ade6f73 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 682.236730] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d1c90e8a-7bcd-4de3-bced-2cd71ade6f73 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 682.247708] nova-conductor[52331]: DEBUG nova.quota [None req-d1c90e8a-7bcd-4de3-bced-2cd71ade6f73 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Getting quotas for project 858f3218a3de48b89a6ec9f62c0ff7fb. Resources: {'instances', 'ram', 'cores'} {{(pid=52331) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 682.250583] nova-conductor[52331]: DEBUG nova.quota [None req-d1c90e8a-7bcd-4de3-bced-2cd71ade6f73 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Getting quotas for user ddd2a269170b464cabb5168a2d6a2ebd and project 858f3218a3de48b89a6ec9f62c0ff7fb. 
Resources: {'instances', 'ram', 'cores'} {{(pid=52331) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 682.258422] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-d1c90e8a-7bcd-4de3-bced-2cd71ade6f73 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] [instance: 00d2c3a4-1970-4aeb-8e84-5c5c25c3c9c6] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52331) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 682.259079] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d1c90e8a-7bcd-4de3-bced-2cd71ade6f73 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 682.259334] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d1c90e8a-7bcd-4de3-bced-2cd71ade6f73 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 682.259534] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d1c90e8a-7bcd-4de3-bced-2cd71ade6f73 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 682.266311] nova-conductor[52331]: DEBUG nova.conductor.manager 
[None req-d1c90e8a-7bcd-4de3-bced-2cd71ade6f73 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] [instance: 00d2c3a4-1970-4aeb-8e84-5c5c25c3c9c6] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 682.267075] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d1c90e8a-7bcd-4de3-bced-2cd71ade6f73 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 682.267333] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d1c90e8a-7bcd-4de3-bced-2cd71ade6f73 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 682.267549] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d1c90e8a-7bcd-4de3-bced-2cd71ade6f73 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Lock 
"59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 682.288399] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d1c90e8a-7bcd-4de3-bced-2cd71ade6f73 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 682.288631] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d1c90e8a-7bcd-4de3-bced-2cd71ade6f73 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 682.288801] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d1c90e8a-7bcd-4de3-bced-2cd71ade6f73 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 683.306946] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_api:SELECT=41,nova_api:UPDATE=4,nova_api:DELETE=5 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 683.308408] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_api:SELECT=39,nova_api:UPDATE=4,nova_api:DELETE=3 {{(pid=52332) stat_writer 
/usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 683.458291] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-904ba05a-bdd4-42f9-804f-8a0c8714fa33 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Took 0.15 seconds to select destinations for 1 instance(s). {{(pid=52332) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 683.472975] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-904ba05a-bdd4-42f9-804f-8a0c8714fa33 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 683.472975] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-904ba05a-bdd4-42f9-804f-8a0c8714fa33 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 683.472975] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-904ba05a-bdd4-42f9-804f-8a0c8714fa33 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 683.504949] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-904ba05a-bdd4-42f9-804f-8a0c8714fa33 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 683.505179] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-904ba05a-bdd4-42f9-804f-8a0c8714fa33 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 683.505362] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-904ba05a-bdd4-42f9-804f-8a0c8714fa33 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 683.505761] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-904ba05a-bdd4-42f9-804f-8a0c8714fa33 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 683.505934] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-904ba05a-bdd4-42f9-804f-8a0c8714fa33 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 683.506092] nova-conductor[52332]: DEBUG 
oslo_concurrency.lockutils [None req-904ba05a-bdd4-42f9-804f-8a0c8714fa33 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 683.517921] nova-conductor[52332]: DEBUG nova.quota [None req-904ba05a-bdd4-42f9-804f-8a0c8714fa33 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Getting quotas for project 9b3b70aec5004da6a27e96e248be7815. Resources: {'instances', 'ram', 'cores'} {{(pid=52332) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 683.520438] nova-conductor[52332]: DEBUG nova.quota [None req-904ba05a-bdd4-42f9-804f-8a0c8714fa33 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Getting quotas for user 3d467a52c7b3473ea9749cfacfd09fa3 and project 9b3b70aec5004da6a27e96e248be7815. 
Resources: {'instances', 'ram', 'cores'} {{(pid=52332) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 683.529320] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-904ba05a-bdd4-42f9-804f-8a0c8714fa33 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] [instance: 22366e9c-9eb5-4f15-864c-d561bea3b452] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52332) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 683.532894] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-904ba05a-bdd4-42f9-804f-8a0c8714fa33 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 683.532894] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-904ba05a-bdd4-42f9-804f-8a0c8714fa33 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 683.532894] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-904ba05a-bdd4-42f9-804f-8a0c8714fa33 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 683.533500] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-904ba05a-bdd4-42f9-804f-8a0c8714fa33 
tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] [instance: 22366e9c-9eb5-4f15-864c-d561bea3b452] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 683.534053] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-904ba05a-bdd4-42f9-804f-8a0c8714fa33 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 683.534256] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-904ba05a-bdd4-42f9-804f-8a0c8714fa33 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 683.534454] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-904ba05a-bdd4-42f9-804f-8a0c8714fa33 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s 
{{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 683.550341] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-904ba05a-bdd4-42f9-804f-8a0c8714fa33 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 683.550599] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-904ba05a-bdd4-42f9-804f-8a0c8714fa33 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 683.550771] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-904ba05a-bdd4-42f9-804f-8a0c8714fa33 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 683.909144] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:SELECT=77,nova_cell1:UPDATE=19,nova_cell1:SAVEPOINT=1,nova_cell1:RELEASE=1,nova_cell1:INSERT=2 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 683.940819] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell1:SELECT=74,nova_cell1:UPDATE=20,nova_cell1:SAVEPOINT=1,nova_cell1:RELEASE=1,nova_cell1:INSERT=4 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 684.924953] nova-conductor[52331]: 
DEBUG nova.conductor.manager [None req-f46e8ff1-5623-4ad9-98d7-9a27a0488d19 tempest-ServerRescueTestJSONUnderV235-1391471308 tempest-ServerRescueTestJSONUnderV235-1391471308-project-member] Took 0.19 seconds to select destinations for 1 instance(s). {{(pid=52331) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 684.936331] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-376834b2-2104-4ee4-b674-816455753229 tempest-ServerRescueTestJSON-655602520 tempest-ServerRescueTestJSON-655602520-project-member] Took 0.16 seconds to select destinations for 1 instance(s). {{(pid=52332) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 684.937525] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-f46e8ff1-5623-4ad9-98d7-9a27a0488d19 tempest-ServerRescueTestJSONUnderV235-1391471308 tempest-ServerRescueTestJSONUnderV235-1391471308-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 684.937970] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-f46e8ff1-5623-4ad9-98d7-9a27a0488d19 tempest-ServerRescueTestJSONUnderV235-1391471308 tempest-ServerRescueTestJSONUnderV235-1391471308-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 684.938213] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-f46e8ff1-5623-4ad9-98d7-9a27a0488d19 tempest-ServerRescueTestJSONUnderV235-1391471308 tempest-ServerRescueTestJSONUnderV235-1391471308-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s 
{{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 684.948826] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-376834b2-2104-4ee4-b674-816455753229 tempest-ServerRescueTestJSON-655602520 tempest-ServerRescueTestJSON-655602520-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 684.949067] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-376834b2-2104-4ee4-b674-816455753229 tempest-ServerRescueTestJSON-655602520 tempest-ServerRescueTestJSON-655602520-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 684.949223] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-376834b2-2104-4ee4-b674-816455753229 tempest-ServerRescueTestJSON-655602520 tempest-ServerRescueTestJSON-655602520-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 684.965749] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-f46e8ff1-5623-4ad9-98d7-9a27a0488d19 tempest-ServerRescueTestJSONUnderV235-1391471308 tempest-ServerRescueTestJSONUnderV235-1391471308-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 684.966030] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None 
req-f46e8ff1-5623-4ad9-98d7-9a27a0488d19 tempest-ServerRescueTestJSONUnderV235-1391471308 tempest-ServerRescueTestJSONUnderV235-1391471308-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 684.966243] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-f46e8ff1-5623-4ad9-98d7-9a27a0488d19 tempest-ServerRescueTestJSONUnderV235-1391471308 tempest-ServerRescueTestJSONUnderV235-1391471308-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 684.966653] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-f46e8ff1-5623-4ad9-98d7-9a27a0488d19 tempest-ServerRescueTestJSONUnderV235-1391471308 tempest-ServerRescueTestJSONUnderV235-1391471308-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 684.966873] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-f46e8ff1-5623-4ad9-98d7-9a27a0488d19 tempest-ServerRescueTestJSONUnderV235-1391471308 tempest-ServerRescueTestJSONUnderV235-1391471308-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 684.967062] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-f46e8ff1-5623-4ad9-98d7-9a27a0488d19 tempest-ServerRescueTestJSONUnderV235-1391471308 
tempest-ServerRescueTestJSONUnderV235-1391471308-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 684.976632] nova-conductor[52331]: DEBUG nova.quota [None req-f46e8ff1-5623-4ad9-98d7-9a27a0488d19 tempest-ServerRescueTestJSONUnderV235-1391471308 tempest-ServerRescueTestJSONUnderV235-1391471308-project-member] Getting quotas for project 5e816839f39445998a5743c757ab25b2. Resources: {'instances', 'ram', 'cores'} {{(pid=52331) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 684.979523] nova-conductor[52331]: DEBUG nova.quota [None req-f46e8ff1-5623-4ad9-98d7-9a27a0488d19 tempest-ServerRescueTestJSONUnderV235-1391471308 tempest-ServerRescueTestJSONUnderV235-1391471308-project-member] Getting quotas for user e5478d0949a84e83b790d7793c61d652 and project 5e816839f39445998a5743c757ab25b2. 
Resources: {'instances', 'ram', 'cores'} {{(pid=52331) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 684.985652] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-f46e8ff1-5623-4ad9-98d7-9a27a0488d19 tempest-ServerRescueTestJSONUnderV235-1391471308 tempest-ServerRescueTestJSONUnderV235-1391471308-project-member] [instance: 015d2802-4b5e-4b75-b889-5524857b408d] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52331) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 684.986221] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-f46e8ff1-5623-4ad9-98d7-9a27a0488d19 tempest-ServerRescueTestJSONUnderV235-1391471308 tempest-ServerRescueTestJSONUnderV235-1391471308-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 684.986474] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-f46e8ff1-5623-4ad9-98d7-9a27a0488d19 tempest-ServerRescueTestJSONUnderV235-1391471308 tempest-ServerRescueTestJSONUnderV235-1391471308-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 684.986822] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-f46e8ff1-5623-4ad9-98d7-9a27a0488d19 tempest-ServerRescueTestJSONUnderV235-1391471308 tempest-ServerRescueTestJSONUnderV235-1391471308-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 684.989506] 
nova-conductor[52331]: DEBUG nova.conductor.manager [None req-f46e8ff1-5623-4ad9-98d7-9a27a0488d19 tempest-ServerRescueTestJSONUnderV235-1391471308 tempest-ServerRescueTestJSONUnderV235-1391471308-project-member] [instance: 015d2802-4b5e-4b75-b889-5524857b408d] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 684.990551] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-f46e8ff1-5623-4ad9-98d7-9a27a0488d19 tempest-ServerRescueTestJSONUnderV235-1391471308 tempest-ServerRescueTestJSONUnderV235-1391471308-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 684.990818] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-f46e8ff1-5623-4ad9-98d7-9a27a0488d19 tempest-ServerRescueTestJSONUnderV235-1391471308 tempest-ServerRescueTestJSONUnderV235-1391471308-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 684.991046] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-f46e8ff1-5623-4ad9-98d7-9a27a0488d19 tempest-ServerRescueTestJSONUnderV235-1391471308 
tempest-ServerRescueTestJSONUnderV235-1391471308-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 684.991298] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-376834b2-2104-4ee4-b674-816455753229 tempest-ServerRescueTestJSON-655602520 tempest-ServerRescueTestJSON-655602520-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 684.991517] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-376834b2-2104-4ee4-b674-816455753229 tempest-ServerRescueTestJSON-655602520 tempest-ServerRescueTestJSON-655602520-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 684.991689] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-376834b2-2104-4ee4-b674-816455753229 tempest-ServerRescueTestJSON-655602520 tempest-ServerRescueTestJSON-655602520-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 684.992094] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-376834b2-2104-4ee4-b674-816455753229 tempest-ServerRescueTestJSON-655602520 tempest-ServerRescueTestJSON-655602520-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 684.992288] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-376834b2-2104-4ee4-b674-816455753229 tempest-ServerRescueTestJSON-655602520 tempest-ServerRescueTestJSON-655602520-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 684.992466] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-376834b2-2104-4ee4-b674-816455753229 tempest-ServerRescueTestJSON-655602520 tempest-ServerRescueTestJSON-655602520-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 685.003043] nova-conductor[52332]: DEBUG nova.quota [None req-376834b2-2104-4ee4-b674-816455753229 tempest-ServerRescueTestJSON-655602520 tempest-ServerRescueTestJSON-655602520-project-member] Getting quotas for project 9a169319587441cdbb9cbec5bff060e6. Resources: {'instances', 'ram', 'cores'} {{(pid=52332) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 685.006330] nova-conductor[52332]: DEBUG nova.quota [None req-376834b2-2104-4ee4-b674-816455753229 tempest-ServerRescueTestJSON-655602520 tempest-ServerRescueTestJSON-655602520-project-member] Getting quotas for user 583c7adfcc52413c9cfb9bb0af13d227 and project 9a169319587441cdbb9cbec5bff060e6. 
Resources: {'instances', 'ram', 'cores'} {{(pid=52332) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 685.007489] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-f46e8ff1-5623-4ad9-98d7-9a27a0488d19 tempest-ServerRescueTestJSONUnderV235-1391471308 tempest-ServerRescueTestJSONUnderV235-1391471308-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 685.007756] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-f46e8ff1-5623-4ad9-98d7-9a27a0488d19 tempest-ServerRescueTestJSONUnderV235-1391471308 tempest-ServerRescueTestJSONUnderV235-1391471308-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 685.007973] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-f46e8ff1-5623-4ad9-98d7-9a27a0488d19 tempest-ServerRescueTestJSONUnderV235-1391471308 tempest-ServerRescueTestJSONUnderV235-1391471308-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 685.012426] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-376834b2-2104-4ee4-b674-816455753229 tempest-ServerRescueTestJSON-655602520 tempest-ServerRescueTestJSON-655602520-project-member] [instance: 67ef21f1-b4f7-4f2d-bf05-691a9d583aeb] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52332) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 685.012924] nova-conductor[52332]: 
DEBUG oslo_concurrency.lockutils [None req-376834b2-2104-4ee4-b674-816455753229 tempest-ServerRescueTestJSON-655602520 tempest-ServerRescueTestJSON-655602520-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 685.013168] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-376834b2-2104-4ee4-b674-816455753229 tempest-ServerRescueTestJSON-655602520 tempest-ServerRescueTestJSON-655602520-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 685.013347] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-376834b2-2104-4ee4-b674-816455753229 tempest-ServerRescueTestJSON-655602520 tempest-ServerRescueTestJSON-655602520-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 685.016421] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-376834b2-2104-4ee4-b674-816455753229 tempest-ServerRescueTestJSON-655602520 tempest-ServerRescueTestJSON-655602520-project-member] [instance: 67ef21f1-b4f7-4f2d-bf05-691a9d583aeb] block_device_mapping 
[BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 685.017143] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-376834b2-2104-4ee4-b674-816455753229 tempest-ServerRescueTestJSON-655602520 tempest-ServerRescueTestJSON-655602520-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 685.017342] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-376834b2-2104-4ee4-b674-816455753229 tempest-ServerRescueTestJSON-655602520 tempest-ServerRescueTestJSON-655602520-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 685.017503] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-376834b2-2104-4ee4-b674-816455753229 tempest-ServerRescueTestJSON-655602520 tempest-ServerRescueTestJSON-655602520-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 685.053172] nova-conductor[52332]: 
DEBUG oslo_concurrency.lockutils [None req-376834b2-2104-4ee4-b674-816455753229 tempest-ServerRescueTestJSON-655602520 tempest-ServerRescueTestJSON-655602520-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 685.053394] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-376834b2-2104-4ee4-b674-816455753229 tempest-ServerRescueTestJSON-655602520 tempest-ServerRescueTestJSON-655602520-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 685.053565] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-376834b2-2104-4ee4-b674-816455753229 tempest-ServerRescueTestJSON-655602520 tempest-ServerRescueTestJSON-655602520-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 687.062883] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-410a9374-be31-4030-86b9-e02148432e1f tempest-InstanceActionsV221TestJSON-1604835679 tempest-InstanceActionsV221TestJSON-1604835679-project-member] Took 0.15 seconds to select destinations for 1 instance(s). 
{{(pid=52331) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 687.076551] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-410a9374-be31-4030-86b9-e02148432e1f tempest-InstanceActionsV221TestJSON-1604835679 tempest-InstanceActionsV221TestJSON-1604835679-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 687.077085] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-410a9374-be31-4030-86b9-e02148432e1f tempest-InstanceActionsV221TestJSON-1604835679 tempest-InstanceActionsV221TestJSON-1604835679-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 687.077419] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-410a9374-be31-4030-86b9-e02148432e1f tempest-InstanceActionsV221TestJSON-1604835679 tempest-InstanceActionsV221TestJSON-1604835679-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 687.126008] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-410a9374-be31-4030-86b9-e02148432e1f tempest-InstanceActionsV221TestJSON-1604835679 tempest-InstanceActionsV221TestJSON-1604835679-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 687.126387] nova-conductor[52331]: DEBUG 
oslo_concurrency.lockutils [None req-410a9374-be31-4030-86b9-e02148432e1f tempest-InstanceActionsV221TestJSON-1604835679 tempest-InstanceActionsV221TestJSON-1604835679-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 687.126865] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-410a9374-be31-4030-86b9-e02148432e1f tempest-InstanceActionsV221TestJSON-1604835679 tempest-InstanceActionsV221TestJSON-1604835679-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 687.127249] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-410a9374-be31-4030-86b9-e02148432e1f tempest-InstanceActionsV221TestJSON-1604835679 tempest-InstanceActionsV221TestJSON-1604835679-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 687.127740] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-410a9374-be31-4030-86b9-e02148432e1f tempest-InstanceActionsV221TestJSON-1604835679 tempest-InstanceActionsV221TestJSON-1604835679-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 687.127834] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-410a9374-be31-4030-86b9-e02148432e1f tempest-InstanceActionsV221TestJSON-1604835679 
tempest-InstanceActionsV221TestJSON-1604835679-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 687.134517] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell0:INSERT=45,nova_cell0:SELECT=44,nova_cell0:UPDATE=9,nova_cell0:SAVEPOINT=1,nova_cell0:RELEASE=1 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 687.144784] nova-conductor[52331]: DEBUG nova.quota [None req-410a9374-be31-4030-86b9-e02148432e1f tempest-InstanceActionsV221TestJSON-1604835679 tempest-InstanceActionsV221TestJSON-1604835679-project-member] Getting quotas for project d93bce0fc7764f478852c8f8d69a1b3f. Resources: {'instances', 'ram', 'cores'} {{(pid=52331) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 687.148222] nova-conductor[52331]: DEBUG nova.quota [None req-410a9374-be31-4030-86b9-e02148432e1f tempest-InstanceActionsV221TestJSON-1604835679 tempest-InstanceActionsV221TestJSON-1604835679-project-member] Getting quotas for user 022eb827078647fdacb9f8abaaacc64e and project d93bce0fc7764f478852c8f8d69a1b3f. 
Resources: {'instances', 'ram', 'cores'} {{(pid=52331) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 687.161305] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-410a9374-be31-4030-86b9-e02148432e1f tempest-InstanceActionsV221TestJSON-1604835679 tempest-InstanceActionsV221TestJSON-1604835679-project-member] [instance: 40bce84f-b0d8-44e4-841f-373bc27ba5df] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52331) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 687.162065] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-410a9374-be31-4030-86b9-e02148432e1f tempest-InstanceActionsV221TestJSON-1604835679 tempest-InstanceActionsV221TestJSON-1604835679-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 687.162397] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-410a9374-be31-4030-86b9-e02148432e1f tempest-InstanceActionsV221TestJSON-1604835679 tempest-InstanceActionsV221TestJSON-1604835679-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 687.162694] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-410a9374-be31-4030-86b9-e02148432e1f tempest-InstanceActionsV221TestJSON-1604835679 tempest-InstanceActionsV221TestJSON-1604835679-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 687.166628] 
nova-conductor[52331]: DEBUG nova.conductor.manager [None req-410a9374-be31-4030-86b9-e02148432e1f tempest-InstanceActionsV221TestJSON-1604835679 tempest-InstanceActionsV221TestJSON-1604835679-project-member] [instance: 40bce84f-b0d8-44e4-841f-373bc27ba5df] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 687.167610] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-410a9374-be31-4030-86b9-e02148432e1f tempest-InstanceActionsV221TestJSON-1604835679 tempest-InstanceActionsV221TestJSON-1604835679-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 687.167939] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-410a9374-be31-4030-86b9-e02148432e1f tempest-InstanceActionsV221TestJSON-1604835679 tempest-InstanceActionsV221TestJSON-1604835679-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 687.168265] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-410a9374-be31-4030-86b9-e02148432e1f tempest-InstanceActionsV221TestJSON-1604835679 
tempest-InstanceActionsV221TestJSON-1604835679-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 687.190435] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-410a9374-be31-4030-86b9-e02148432e1f tempest-InstanceActionsV221TestJSON-1604835679 tempest-InstanceActionsV221TestJSON-1604835679-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 687.190925] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-410a9374-be31-4030-86b9-e02148432e1f tempest-InstanceActionsV221TestJSON-1604835679 tempest-InstanceActionsV221TestJSON-1604835679-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 687.191134] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-410a9374-be31-4030-86b9-e02148432e1f tempest-InstanceActionsV221TestJSON-1604835679 tempest-InstanceActionsV221TestJSON-1604835679-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 687.659169] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:SELECT=76,nova_cell1:UPDATE=21,nova_cell1:INSERT=3 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 691.314099] nova-conductor[52331]: DEBUG dbcounter [-] [52331] 
Writing DB stats nova_cell1:SELECT=76,nova_cell1:UPDATE=21,nova_cell1:INSERT=3 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 692.652261] nova-conductor[52332]: ERROR nova.scheduler.utils [None req-d1c90e8a-7bcd-4de3-bced-2cd71ade6f73 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] [instance: 00d2c3a4-1970-4aeb-8e84-5c5c25c3c9c6] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 537, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n 
result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port c86eb909-634d-4322-8859-64c52ec161a6, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 00d2c3a4-1970-4aeb-8e84-5c5c25c3c9c6 was re-scheduled: Binding 
failed for port c86eb909-634d-4322-8859-64c52ec161a6, please check neutron logs for more information.\n'] [ 692.652950] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-d1c90e8a-7bcd-4de3-bced-2cd71ade6f73 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Rescheduling: True {{(pid=52332) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 692.653196] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-d1c90e8a-7bcd-4de3-bced-2cd71ade6f73 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 00d2c3a4-1970-4aeb-8e84-5c5c25c3c9c6.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 00d2c3a4-1970-4aeb-8e84-5c5c25c3c9c6. [ 692.653512] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-d1c90e8a-7bcd-4de3-bced-2cd71ade6f73 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] [instance: 00d2c3a4-1970-4aeb-8e84-5c5c25c3c9c6] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 00d2c3a4-1970-4aeb-8e84-5c5c25c3c9c6. 
[ 692.681490] nova-conductor[52332]: DEBUG nova.network.neutron [None req-d1c90e8a-7bcd-4de3-bced-2cd71ade6f73 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] [instance: 00d2c3a4-1970-4aeb-8e84-5c5c25c3c9c6] deallocate_for_instance() {{(pid=52332) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 692.773794] nova-conductor[52332]: DEBUG nova.network.neutron [None req-d1c90e8a-7bcd-4de3-bced-2cd71ade6f73 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] [instance: 00d2c3a4-1970-4aeb-8e84-5c5c25c3c9c6] Instance cache missing network info. {{(pid=52332) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 692.777136] nova-conductor[52332]: DEBUG nova.network.neutron [None req-d1c90e8a-7bcd-4de3-bced-2cd71ade6f73 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] [instance: 00d2c3a4-1970-4aeb-8e84-5c5c25c3c9c6] Updating instance_info_cache with network_info: [] {{(pid=52332) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 693.528879] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-c6a5d7c8-df77-4b2f-bc23-38003504adfd tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Took 0.13 seconds to select destinations for 1 instance(s). 
{{(pid=52332) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 693.543071] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-c6a5d7c8-df77-4b2f-bc23-38003504adfd tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 693.543351] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-c6a5d7c8-df77-4b2f-bc23-38003504adfd tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 693.543558] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-c6a5d7c8-df77-4b2f-bc23-38003504adfd tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 693.578089] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-c6a5d7c8-df77-4b2f-bc23-38003504adfd tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 693.578430] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None 
req-c6a5d7c8-df77-4b2f-bc23-38003504adfd tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 693.578653] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-c6a5d7c8-df77-4b2f-bc23-38003504adfd tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 693.579030] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-c6a5d7c8-df77-4b2f-bc23-38003504adfd tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 693.579209] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-c6a5d7c8-df77-4b2f-bc23-38003504adfd tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 693.579363] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-c6a5d7c8-df77-4b2f-bc23-38003504adfd tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock 
"59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 693.589706] nova-conductor[52332]: DEBUG nova.quota [None req-c6a5d7c8-df77-4b2f-bc23-38003504adfd tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Getting quotas for project f31d6cd1c9a045beacfc60af31d1ffad. Resources: {'instances', 'ram', 'cores'} {{(pid=52332) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 693.592100] nova-conductor[52332]: DEBUG nova.quota [None req-c6a5d7c8-df77-4b2f-bc23-38003504adfd tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Getting quotas for user 32d3f462daeb4beb923d97ca93470591 and project f31d6cd1c9a045beacfc60af31d1ffad. Resources: {'instances', 'ram', 'cores'} {{(pid=52332) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 693.598186] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-c6a5d7c8-df77-4b2f-bc23-38003504adfd tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] [instance: 33ef045b-5214-4386-a661-d6c01626f392] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52332) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 693.598650] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-c6a5d7c8-df77-4b2f-bc23-38003504adfd tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 693.598908] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None 
req-c6a5d7c8-df77-4b2f-bc23-38003504adfd tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 693.599104] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-c6a5d7c8-df77-4b2f-bc23-38003504adfd tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 693.602367] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-c6a5d7c8-df77-4b2f-bc23-38003504adfd tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] [instance: 33ef045b-5214-4386-a661-d6c01626f392] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 693.603194] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-c6a5d7c8-df77-4b2f-bc23-38003504adfd tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Acquiring lock 
"59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 693.603559] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-c6a5d7c8-df77-4b2f-bc23-38003504adfd tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 693.603725] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-c6a5d7c8-df77-4b2f-bc23-38003504adfd tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 693.627867] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-c6a5d7c8-df77-4b2f-bc23-38003504adfd tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 693.627867] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-c6a5d7c8-df77-4b2f-bc23-38003504adfd tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 693.627867] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-c6a5d7c8-df77-4b2f-bc23-38003504adfd tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 693.846437] nova-conductor[52332]: ERROR nova.scheduler.utils [None req-904ba05a-bdd4-42f9-804f-8a0c8714fa33 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] [instance: 22366e9c-9eb5-4f15-864c-d561bea3b452] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 537, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return 
self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 322598de-492f-48b7-a71c-764d7051ba79, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File 
"/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 22366e9c-9eb5-4f15-864c-d561bea3b452 was re-scheduled: Binding failed for port 322598de-492f-48b7-a71c-764d7051ba79, please check neutron logs for more information.\n'] [ 693.847084] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-904ba05a-bdd4-42f9-804f-8a0c8714fa33 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Rescheduling: True {{(pid=52332) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 693.847367] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-904ba05a-bdd4-42f9-804f-8a0c8714fa33 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 22366e9c-9eb5-4f15-864c-d561bea3b452.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 22366e9c-9eb5-4f15-864c-d561bea3b452. [ 693.847624] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-904ba05a-bdd4-42f9-804f-8a0c8714fa33 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] [instance: 22366e9c-9eb5-4f15-864c-d561bea3b452] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 22366e9c-9eb5-4f15-864c-d561bea3b452. 
[ 693.908665] nova-conductor[52332]: DEBUG nova.network.neutron [None req-904ba05a-bdd4-42f9-804f-8a0c8714fa33 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] [instance: 22366e9c-9eb5-4f15-864c-d561bea3b452] deallocate_for_instance() {{(pid=52332) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 694.063573] nova-conductor[52332]: DEBUG nova.network.neutron [None req-904ba05a-bdd4-42f9-804f-8a0c8714fa33 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] [instance: 22366e9c-9eb5-4f15-864c-d561bea3b452] Instance cache missing network info. {{(pid=52332) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 694.067099] nova-conductor[52332]: DEBUG nova.network.neutron [None req-904ba05a-bdd4-42f9-804f-8a0c8714fa33 tempest-ImagesTestJSON-2132737286 tempest-ImagesTestJSON-2132737286-project-member] [instance: 22366e9c-9eb5-4f15-864c-d561bea3b452] Updating instance_info_cache with network_info: [] {{(pid=52332) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 694.102423] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:UPDATE=15,nova_cell1:SELECT=67,nova_cell1:SAVEPOINT=7,nova_cell1:RELEASE=7,nova_cell1:INSERT=4 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 694.110749] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell1:SAVEPOINT=6,nova_cell1:RELEASE=5,nova_cell1:SELECT=66,nova_cell1:UPDATE=21,nova_cell1:INSERT=2 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 694.816199] nova-conductor[52332]: ERROR nova.scheduler.utils [None req-f46e8ff1-5623-4ad9-98d7-9a27a0488d19 tempest-ServerRescueTestJSONUnderV235-1391471308 tempest-ServerRescueTestJSONUnderV235-1391471308-project-member] [instance: 015d2802-4b5e-4b75-b889-5524857b408d] Error from last host: cpu-1 (node 
domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 537, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in 
allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 8e4050e9-12ac-4c24-9567-acfda0eee794, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 015d2802-4b5e-4b75-b889-5524857b408d was re-scheduled: Binding failed for port 8e4050e9-12ac-4c24-9567-acfda0eee794, please check neutron logs for more information.\n'] [ 694.817110] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-f46e8ff1-5623-4ad9-98d7-9a27a0488d19 tempest-ServerRescueTestJSONUnderV235-1391471308 tempest-ServerRescueTestJSONUnderV235-1391471308-project-member] Rescheduling: True {{(pid=52332) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 694.817526] 
nova-conductor[52332]: WARNING nova.scheduler.utils [None req-f46e8ff1-5623-4ad9-98d7-9a27a0488d19 tempest-ServerRescueTestJSONUnderV235-1391471308 tempest-ServerRescueTestJSONUnderV235-1391471308-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 015d2802-4b5e-4b75-b889-5524857b408d.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 015d2802-4b5e-4b75-b889-5524857b408d. [ 694.818040] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-f46e8ff1-5623-4ad9-98d7-9a27a0488d19 tempest-ServerRescueTestJSONUnderV235-1391471308 tempest-ServerRescueTestJSONUnderV235-1391471308-project-member] [instance: 015d2802-4b5e-4b75-b889-5524857b408d] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 015d2802-4b5e-4b75-b889-5524857b408d. [ 694.844358] nova-conductor[52332]: DEBUG nova.network.neutron [None req-f46e8ff1-5623-4ad9-98d7-9a27a0488d19 tempest-ServerRescueTestJSONUnderV235-1391471308 tempest-ServerRescueTestJSONUnderV235-1391471308-project-member] [instance: 015d2802-4b5e-4b75-b889-5524857b408d] deallocate_for_instance() {{(pid=52332) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 694.884280] nova-conductor[52332]: DEBUG nova.network.neutron [None req-f46e8ff1-5623-4ad9-98d7-9a27a0488d19 tempest-ServerRescueTestJSONUnderV235-1391471308 tempest-ServerRescueTestJSONUnderV235-1391471308-project-member] [instance: 015d2802-4b5e-4b75-b889-5524857b408d] Instance cache missing network info. 
{{(pid=52332) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 694.887704] nova-conductor[52332]: DEBUG nova.network.neutron [None req-f46e8ff1-5623-4ad9-98d7-9a27a0488d19 tempest-ServerRescueTestJSONUnderV235-1391471308 tempest-ServerRescueTestJSONUnderV235-1391471308-project-member] [instance: 015d2802-4b5e-4b75-b889-5524857b408d] Updating instance_info_cache with network_info: [] {{(pid=52332) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 696.119463] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-1a55cac4-60e4-4722-a8aa-b546eafe3481 tempest-InstanceActionsNegativeTestJSON-483395512 tempest-InstanceActionsNegativeTestJSON-483395512-project-member] Took 0.14 seconds to select destinations for 1 instance(s). {{(pid=52332) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 696.136919] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-1a55cac4-60e4-4722-a8aa-b546eafe3481 tempest-InstanceActionsNegativeTestJSON-483395512 tempest-InstanceActionsNegativeTestJSON-483395512-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 696.137415] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-1a55cac4-60e4-4722-a8aa-b546eafe3481 tempest-InstanceActionsNegativeTestJSON-483395512 tempest-InstanceActionsNegativeTestJSON-483395512-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 696.137415] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-1a55cac4-60e4-4722-a8aa-b546eafe3481 tempest-InstanceActionsNegativeTestJSON-483395512 
tempest-InstanceActionsNegativeTestJSON-483395512-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 696.154741] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:SELECT=30,nova_cell1:SAVEPOINT=3,nova_cell1:INSERT=64,nova_cell1:RELEASE=3 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 696.170360] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-1a55cac4-60e4-4722-a8aa-b546eafe3481 tempest-InstanceActionsNegativeTestJSON-483395512 tempest-InstanceActionsNegativeTestJSON-483395512-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 696.170636] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-1a55cac4-60e4-4722-a8aa-b546eafe3481 tempest-InstanceActionsNegativeTestJSON-483395512 tempest-InstanceActionsNegativeTestJSON-483395512-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 696.170880] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-1a55cac4-60e4-4722-a8aa-b546eafe3481 tempest-InstanceActionsNegativeTestJSON-483395512 tempest-InstanceActionsNegativeTestJSON-483395512-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 696.171277] 
nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-1a55cac4-60e4-4722-a8aa-b546eafe3481 tempest-InstanceActionsNegativeTestJSON-483395512 tempest-InstanceActionsNegativeTestJSON-483395512-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 696.171530] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-1a55cac4-60e4-4722-a8aa-b546eafe3481 tempest-InstanceActionsNegativeTestJSON-483395512 tempest-InstanceActionsNegativeTestJSON-483395512-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 696.171747] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-1a55cac4-60e4-4722-a8aa-b546eafe3481 tempest-InstanceActionsNegativeTestJSON-483395512 tempest-InstanceActionsNegativeTestJSON-483395512-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 696.181857] nova-conductor[52332]: DEBUG nova.quota [None req-1a55cac4-60e4-4722-a8aa-b546eafe3481 tempest-InstanceActionsNegativeTestJSON-483395512 tempest-InstanceActionsNegativeTestJSON-483395512-project-member] Getting quotas for project 2f1aedc60ff04a8a80cb53e0d26c0c09. 
Resources: {'instances', 'ram', 'cores'} {{(pid=52332) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 696.184381] nova-conductor[52332]: DEBUG nova.quota [None req-1a55cac4-60e4-4722-a8aa-b546eafe3481 tempest-InstanceActionsNegativeTestJSON-483395512 tempest-InstanceActionsNegativeTestJSON-483395512-project-member] Getting quotas for user c69618225a754dbca1531e3b05670f9c and project 2f1aedc60ff04a8a80cb53e0d26c0c09. Resources: {'instances', 'ram', 'cores'} {{(pid=52332) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 696.190754] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-1a55cac4-60e4-4722-a8aa-b546eafe3481 tempest-InstanceActionsNegativeTestJSON-483395512 tempest-InstanceActionsNegativeTestJSON-483395512-project-member] [instance: b9fcaf7b-c221-4bed-8838-39ac0f04a62a] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52332) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 696.191293] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-1a55cac4-60e4-4722-a8aa-b546eafe3481 tempest-InstanceActionsNegativeTestJSON-483395512 tempest-InstanceActionsNegativeTestJSON-483395512-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 696.191596] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-1a55cac4-60e4-4722-a8aa-b546eafe3481 tempest-InstanceActionsNegativeTestJSON-483395512 tempest-InstanceActionsNegativeTestJSON-483395512-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 696.191879] nova-conductor[52332]: DEBUG 
oslo_concurrency.lockutils [None req-1a55cac4-60e4-4722-a8aa-b546eafe3481 tempest-InstanceActionsNegativeTestJSON-483395512 tempest-InstanceActionsNegativeTestJSON-483395512-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 696.195306] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-1a55cac4-60e4-4722-a8aa-b546eafe3481 tempest-InstanceActionsNegativeTestJSON-483395512 tempest-InstanceActionsNegativeTestJSON-483395512-project-member] [instance: b9fcaf7b-c221-4bed-8838-39ac0f04a62a] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 696.196361] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-1a55cac4-60e4-4722-a8aa-b546eafe3481 tempest-InstanceActionsNegativeTestJSON-483395512 tempest-InstanceActionsNegativeTestJSON-483395512-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 696.196618] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-1a55cac4-60e4-4722-a8aa-b546eafe3481 tempest-InstanceActionsNegativeTestJSON-483395512 
tempest-InstanceActionsNegativeTestJSON-483395512-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 696.196894] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-1a55cac4-60e4-4722-a8aa-b546eafe3481 tempest-InstanceActionsNegativeTestJSON-483395512 tempest-InstanceActionsNegativeTestJSON-483395512-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 696.219662] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-1a55cac4-60e4-4722-a8aa-b546eafe3481 tempest-InstanceActionsNegativeTestJSON-483395512 tempest-InstanceActionsNegativeTestJSON-483395512-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 696.219968] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-1a55cac4-60e4-4722-a8aa-b546eafe3481 tempest-InstanceActionsNegativeTestJSON-483395512 tempest-InstanceActionsNegativeTestJSON-483395512-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 696.220366] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-1a55cac4-60e4-4722-a8aa-b546eafe3481 tempest-InstanceActionsNegativeTestJSON-483395512 tempest-InstanceActionsNegativeTestJSON-483395512-project-member] Lock 
"59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 696.278853] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:UPDATE=16,nova_cell1:SELECT=70,nova_cell1:INSERT=2,nova_cell1:SAVEPOINT=6,nova_cell1:RELEASE=6 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 696.285276] nova-conductor[52331]: ERROR nova.scheduler.utils [None req-c6a5d7c8-df77-4b2f-bc23-38003504adfd tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] [instance: 33ef045b-5214-4386-a661-d6c01626f392] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 537, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return 
self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 3ba2b939-33b6-47cc-9cd7-ee2e850bebd8, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File 
"/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 33ef045b-5214-4386-a661-d6c01626f392 was re-scheduled: Binding failed for port 3ba2b939-33b6-47cc-9cd7-ee2e850bebd8, please check neutron logs for more information.\n'] [ 696.286256] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-c6a5d7c8-df77-4b2f-bc23-38003504adfd tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Rescheduling: True {{(pid=52331) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 696.286615] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-c6a5d7c8-df77-4b2f-bc23-38003504adfd tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 33ef045b-5214-4386-a661-d6c01626f392.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 33ef045b-5214-4386-a661-d6c01626f392. [ 696.287085] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-c6a5d7c8-df77-4b2f-bc23-38003504adfd tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] [instance: 33ef045b-5214-4386-a661-d6c01626f392] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 33ef045b-5214-4386-a661-d6c01626f392. 
[ 696.315931] nova-conductor[52331]: DEBUG nova.network.neutron [None req-c6a5d7c8-df77-4b2f-bc23-38003504adfd tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] [instance: 33ef045b-5214-4386-a661-d6c01626f392] deallocate_for_instance() {{(pid=52331) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 696.424238] nova-conductor[52331]: DEBUG nova.network.neutron [None req-c6a5d7c8-df77-4b2f-bc23-38003504adfd tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] [instance: 33ef045b-5214-4386-a661-d6c01626f392] Instance cache missing network info. {{(pid=52331) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 696.428118] nova-conductor[52331]: DEBUG nova.network.neutron [None req-c6a5d7c8-df77-4b2f-bc23-38003504adfd tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] [instance: 33ef045b-5214-4386-a661-d6c01626f392] Updating instance_info_cache with network_info: [] {{(pid=52331) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 696.521774] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell1:SELECT=66,nova_cell1:UPDATE=23,nova_cell1:RELEASE=4,nova_cell1:SAVEPOINT=3,nova_cell1:INSERT=4 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 697.138534] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell0:SELECT=1 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 697.170920] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell1:SELECT=27,nova_cell1:SAVEPOINT=3,nova_cell1:INSERT=60,nova_cell1:RELEASE=3 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 697.181830] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats 
nova_api:SELECT=67,nova_api:UPDATE=3,nova_api:DELETE=3 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 697.188038] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_api:SELECT=63,nova_api:UPDATE=6,nova_api:DELETE=6 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 697.549078] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Took 0.20 seconds to select destinations for 2 instance(s). {{(pid=52331) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 697.563321] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 697.563321] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 697.563321] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 697.596538] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 697.596859] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 697.597034] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 697.627006] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 697.627225] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb 
tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 697.627396] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 697.627923] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 697.628144] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 697.628337] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 697.636914] nova-conductor[52331]: DEBUG nova.quota [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Getting quotas for project 3855ad934a0e461b821e48fd1680bcb9. Resources: {'instances', 'ram', 'cores'} {{(pid=52331) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 697.639219] nova-conductor[52331]: DEBUG nova.quota [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Getting quotas for user 978b666d27814afabf1e2b71169d4b26 and project 3855ad934a0e461b821e48fd1680bcb9. Resources: {'instances', 'ram', 'cores'} {{(pid=52331) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 697.644752] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] [instance: bc10b0e6-e56e-4c07-87aa-a552c4c5ce10] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52331) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 697.645212] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 697.645417] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb 
tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 697.645585] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 697.649895] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] [instance: bc10b0e6-e56e-4c07-87aa-a552c4c5ce10] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 697.650695] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 697.650935] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 697.651113] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 697.665156] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 697.665385] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 
697.665555] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 697.672284] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] [instance: 1a83a3c2-0f15-4d6d-ba2e-921485ade456] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52331) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 697.672931] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 697.673240] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 697.673507] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 
tempest-MultipleCreateTestJSON-371914168-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 697.682651] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] [instance: 1a83a3c2-0f15-4d6d-ba2e-921485ade456] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 697.682651] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 697.682651] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 
0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 697.682651] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 697.743094] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 697.743478] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 697.743791] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 698.069274] nova-conductor[52332]: ERROR nova.scheduler.utils [None 
req-376834b2-2104-4ee4-b674-816455753229 tempest-ServerRescueTestJSON-655602520 tempest-ServerRescueTestJSON-655602520-project-member] [instance: 67ef21f1-b4f7-4f2d-bf05-691a9d583aeb] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 537, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise 
e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 14fefe55-c951-4cf6-ab1a-964e02b022b7, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 67ef21f1-b4f7-4f2d-bf05-691a9d583aeb was re-scheduled: Binding failed for port 14fefe55-c951-4cf6-ab1a-964e02b022b7, please check neutron logs for more information.\n'] [ 698.069558] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-376834b2-2104-4ee4-b674-816455753229 
tempest-ServerRescueTestJSON-655602520 tempest-ServerRescueTestJSON-655602520-project-member] Rescheduling: True {{(pid=52332) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 698.069786] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-376834b2-2104-4ee4-b674-816455753229 tempest-ServerRescueTestJSON-655602520 tempest-ServerRescueTestJSON-655602520-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 67ef21f1-b4f7-4f2d-bf05-691a9d583aeb.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 67ef21f1-b4f7-4f2d-bf05-691a9d583aeb. [ 698.070115] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-376834b2-2104-4ee4-b674-816455753229 tempest-ServerRescueTestJSON-655602520 tempest-ServerRescueTestJSON-655602520-project-member] [instance: 67ef21f1-b4f7-4f2d-bf05-691a9d583aeb] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 67ef21f1-b4f7-4f2d-bf05-691a9d583aeb. [ 698.099360] nova-conductor[52332]: DEBUG nova.network.neutron [None req-376834b2-2104-4ee4-b674-816455753229 tempest-ServerRescueTestJSON-655602520 tempest-ServerRescueTestJSON-655602520-project-member] [instance: 67ef21f1-b4f7-4f2d-bf05-691a9d583aeb] deallocate_for_instance() {{(pid=52332) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 698.188192] nova-conductor[52332]: DEBUG nova.network.neutron [None req-376834b2-2104-4ee4-b674-816455753229 tempest-ServerRescueTestJSON-655602520 tempest-ServerRescueTestJSON-655602520-project-member] [instance: 67ef21f1-b4f7-4f2d-bf05-691a9d583aeb] Instance cache missing network info. 
{{(pid=52332) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 698.193823] nova-conductor[52332]: DEBUG nova.network.neutron [None req-376834b2-2104-4ee4-b674-816455753229 tempest-ServerRescueTestJSON-655602520 tempest-ServerRescueTestJSON-655602520-project-member] [instance: 67ef21f1-b4f7-4f2d-bf05-691a9d583aeb] Updating instance_info_cache with network_info: [] {{(pid=52332) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 698.205081] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:SELECT=70,nova_cell1:UPDATE=19,nova_cell1:SAVEPOINT=4,nova_cell1:RELEASE=4,nova_cell1:INSERT=3 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 698.388806] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell1:SELECT=72,nova_cell1:UPDATE=21,nova_cell1:SAVEPOINT=2,nova_cell1:RELEASE=2,nova_cell1:INSERT=3 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 699.091745] nova-conductor[52331]: ERROR nova.scheduler.utils [None req-410a9374-be31-4030-86b9-e02148432e1f tempest-InstanceActionsV221TestJSON-1604835679 tempest-InstanceActionsV221TestJSON-1604835679-project-member] [instance: 40bce84f-b0d8-44e4-841f-373bc27ba5df] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 537, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File 
"/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in 
_update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port c60d1d81-f3a8-4029-a0f0-6415cbb22a2e, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 40bce84f-b0d8-44e4-841f-373bc27ba5df was re-scheduled: Binding failed for port c60d1d81-f3a8-4029-a0f0-6415cbb22a2e, please check neutron logs for more information.\n'] [ 699.092346] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-410a9374-be31-4030-86b9-e02148432e1f tempest-InstanceActionsV221TestJSON-1604835679 tempest-InstanceActionsV221TestJSON-1604835679-project-member] Rescheduling: True {{(pid=52331) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 699.092567] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-410a9374-be31-4030-86b9-e02148432e1f tempest-InstanceActionsV221TestJSON-1604835679 tempest-InstanceActionsV221TestJSON-1604835679-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 40bce84f-b0d8-44e4-841f-373bc27ba5df.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 40bce84f-b0d8-44e4-841f-373bc27ba5df. 
[ 699.092898] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-410a9374-be31-4030-86b9-e02148432e1f tempest-InstanceActionsV221TestJSON-1604835679 tempest-InstanceActionsV221TestJSON-1604835679-project-member] [instance: 40bce84f-b0d8-44e4-841f-373bc27ba5df] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 40bce84f-b0d8-44e4-841f-373bc27ba5df. [ 699.113391] nova-conductor[52331]: DEBUG nova.network.neutron [None req-410a9374-be31-4030-86b9-e02148432e1f tempest-InstanceActionsV221TestJSON-1604835679 tempest-InstanceActionsV221TestJSON-1604835679-project-member] [instance: 40bce84f-b0d8-44e4-841f-373bc27ba5df] deallocate_for_instance() {{(pid=52331) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 699.355999] nova-conductor[52331]: DEBUG nova.network.neutron [None req-410a9374-be31-4030-86b9-e02148432e1f tempest-InstanceActionsV221TestJSON-1604835679 tempest-InstanceActionsV221TestJSON-1604835679-project-member] [instance: 40bce84f-b0d8-44e4-841f-373bc27ba5df] Instance cache missing network info. 
{{(pid=52331) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 699.360159] nova-conductor[52331]: DEBUG nova.network.neutron [None req-410a9374-be31-4030-86b9-e02148432e1f tempest-InstanceActionsV221TestJSON-1604835679 tempest-InstanceActionsV221TestJSON-1604835679-project-member] [instance: 40bce84f-b0d8-44e4-841f-373bc27ba5df] Updating instance_info_cache with network_info: [] {{(pid=52331) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 699.379409] nova-conductor[52332]: ERROR nova.scheduler.utils [None req-1a55cac4-60e4-4722-a8aa-b546eafe3481 tempest-InstanceActionsNegativeTestJSON-483395512 tempest-InstanceActionsNegativeTestJSON-483395512-project-member] [instance: b9fcaf7b-c221-4bed-8838-39ac0f04a62a] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 537, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return 
self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 5a349fd5-0e24-41c8-8430-d2247f8f75cb, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File 
"/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance b9fcaf7b-c221-4bed-8838-39ac0f04a62a was re-scheduled: Binding failed for port 5a349fd5-0e24-41c8-8430-d2247f8f75cb, please check neutron logs for more information.\n'] [ 699.379988] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-1a55cac4-60e4-4722-a8aa-b546eafe3481 tempest-InstanceActionsNegativeTestJSON-483395512 tempest-InstanceActionsNegativeTestJSON-483395512-project-member] Rescheduling: True {{(pid=52332) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 699.380809] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-1a55cac4-60e4-4722-a8aa-b546eafe3481 tempest-InstanceActionsNegativeTestJSON-483395512 tempest-InstanceActionsNegativeTestJSON-483395512-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance b9fcaf7b-c221-4bed-8838-39ac0f04a62a.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance b9fcaf7b-c221-4bed-8838-39ac0f04a62a. [ 699.381024] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-1a55cac4-60e4-4722-a8aa-b546eafe3481 tempest-InstanceActionsNegativeTestJSON-483395512 tempest-InstanceActionsNegativeTestJSON-483395512-project-member] [instance: b9fcaf7b-c221-4bed-8838-39ac0f04a62a] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance b9fcaf7b-c221-4bed-8838-39ac0f04a62a. 
[ 699.401181] nova-conductor[52332]: DEBUG nova.network.neutron [None req-1a55cac4-60e4-4722-a8aa-b546eafe3481 tempest-InstanceActionsNegativeTestJSON-483395512 tempest-InstanceActionsNegativeTestJSON-483395512-project-member] [instance: b9fcaf7b-c221-4bed-8838-39ac0f04a62a] deallocate_for_instance() {{(pid=52332) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 699.677589] nova-conductor[52332]: DEBUG nova.network.neutron [None req-1a55cac4-60e4-4722-a8aa-b546eafe3481 tempest-InstanceActionsNegativeTestJSON-483395512 tempest-InstanceActionsNegativeTestJSON-483395512-project-member] [instance: b9fcaf7b-c221-4bed-8838-39ac0f04a62a] Instance cache missing network info. {{(pid=52332) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 699.683329] nova-conductor[52332]: DEBUG nova.network.neutron [None req-1a55cac4-60e4-4722-a8aa-b546eafe3481 tempest-InstanceActionsNegativeTestJSON-483395512 tempest-InstanceActionsNegativeTestJSON-483395512-project-member] [instance: b9fcaf7b-c221-4bed-8838-39ac0f04a62a] Updating instance_info_cache with network_info: [] {{(pid=52332) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 700.073788] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-7cb9a9c9-6bae-43a7-9f29-4cd28d8d77dd tempest-SecurityGroupsTestJSON-1291402501 tempest-SecurityGroupsTestJSON-1291402501-project-member] Took 0.21 seconds to select destinations for 1 instance(s). 
{{(pid=52332) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 700.084878] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-7cb9a9c9-6bae-43a7-9f29-4cd28d8d77dd tempest-SecurityGroupsTestJSON-1291402501 tempest-SecurityGroupsTestJSON-1291402501-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 700.085241] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-7cb9a9c9-6bae-43a7-9f29-4cd28d8d77dd tempest-SecurityGroupsTestJSON-1291402501 tempest-SecurityGroupsTestJSON-1291402501-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 700.085300] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-7cb9a9c9-6bae-43a7-9f29-4cd28d8d77dd tempest-SecurityGroupsTestJSON-1291402501 tempest-SecurityGroupsTestJSON-1291402501-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 700.128754] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-7cb9a9c9-6bae-43a7-9f29-4cd28d8d77dd tempest-SecurityGroupsTestJSON-1291402501 tempest-SecurityGroupsTestJSON-1291402501-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 700.128754] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None 
req-7cb9a9c9-6bae-43a7-9f29-4cd28d8d77dd tempest-SecurityGroupsTestJSON-1291402501 tempest-SecurityGroupsTestJSON-1291402501-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 700.128754] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-7cb9a9c9-6bae-43a7-9f29-4cd28d8d77dd tempest-SecurityGroupsTestJSON-1291402501 tempest-SecurityGroupsTestJSON-1291402501-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 700.129205] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-7cb9a9c9-6bae-43a7-9f29-4cd28d8d77dd tempest-SecurityGroupsTestJSON-1291402501 tempest-SecurityGroupsTestJSON-1291402501-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 700.129271] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-7cb9a9c9-6bae-43a7-9f29-4cd28d8d77dd tempest-SecurityGroupsTestJSON-1291402501 tempest-SecurityGroupsTestJSON-1291402501-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 700.129419] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-7cb9a9c9-6bae-43a7-9f29-4cd28d8d77dd tempest-SecurityGroupsTestJSON-1291402501 tempest-SecurityGroupsTestJSON-1291402501-project-member] Lock 
"59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 700.138784] nova-conductor[52332]: DEBUG nova.quota [None req-7cb9a9c9-6bae-43a7-9f29-4cd28d8d77dd tempest-SecurityGroupsTestJSON-1291402501 tempest-SecurityGroupsTestJSON-1291402501-project-member] Getting quotas for project 780394603c614e6385d7cbb31b22cffe. Resources: {'instances', 'ram', 'cores'} {{(pid=52332) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 700.141176] nova-conductor[52332]: DEBUG nova.quota [None req-7cb9a9c9-6bae-43a7-9f29-4cd28d8d77dd tempest-SecurityGroupsTestJSON-1291402501 tempest-SecurityGroupsTestJSON-1291402501-project-member] Getting quotas for user a5f7e4b1a1ca49328595635041bb4d2b and project 780394603c614e6385d7cbb31b22cffe. Resources: {'instances', 'ram', 'cores'} {{(pid=52332) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 700.147136] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-7cb9a9c9-6bae-43a7-9f29-4cd28d8d77dd tempest-SecurityGroupsTestJSON-1291402501 tempest-SecurityGroupsTestJSON-1291402501-project-member] [instance: d77dcb9a-4f92-4dca-9182-cf8a5676cfd5] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52332) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 700.147459] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-7cb9a9c9-6bae-43a7-9f29-4cd28d8d77dd tempest-SecurityGroupsTestJSON-1291402501 tempest-SecurityGroupsTestJSON-1291402501-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 700.147563] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None 
req-7cb9a9c9-6bae-43a7-9f29-4cd28d8d77dd tempest-SecurityGroupsTestJSON-1291402501 tempest-SecurityGroupsTestJSON-1291402501-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 700.147849] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-7cb9a9c9-6bae-43a7-9f29-4cd28d8d77dd tempest-SecurityGroupsTestJSON-1291402501 tempest-SecurityGroupsTestJSON-1291402501-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 700.150586] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-7cb9a9c9-6bae-43a7-9f29-4cd28d8d77dd tempest-SecurityGroupsTestJSON-1291402501 tempest-SecurityGroupsTestJSON-1291402501-project-member] [instance: d77dcb9a-4f92-4dca-9182-cf8a5676cfd5] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 700.151334] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-7cb9a9c9-6bae-43a7-9f29-4cd28d8d77dd tempest-SecurityGroupsTestJSON-1291402501 tempest-SecurityGroupsTestJSON-1291402501-project-member] Acquiring lock 
"59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 700.151334] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-7cb9a9c9-6bae-43a7-9f29-4cd28d8d77dd tempest-SecurityGroupsTestJSON-1291402501 tempest-SecurityGroupsTestJSON-1291402501-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 700.151676] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-7cb9a9c9-6bae-43a7-9f29-4cd28d8d77dd tempest-SecurityGroupsTestJSON-1291402501 tempest-SecurityGroupsTestJSON-1291402501-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 700.165727] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-7cb9a9c9-6bae-43a7-9f29-4cd28d8d77dd tempest-SecurityGroupsTestJSON-1291402501 tempest-SecurityGroupsTestJSON-1291402501-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 700.165966] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-7cb9a9c9-6bae-43a7-9f29-4cd28d8d77dd tempest-SecurityGroupsTestJSON-1291402501 tempest-SecurityGroupsTestJSON-1291402501-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 700.166105] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-7cb9a9c9-6bae-43a7-9f29-4cd28d8d77dd tempest-SecurityGroupsTestJSON-1291402501 tempest-SecurityGroupsTestJSON-1291402501-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 700.198982] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:SAVEPOINT=4,nova_cell1:UPDATE=22,nova_cell1:RELEASE=4,nova_cell1:SELECT=68,nova_cell1:INSERT=2 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 700.517571] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell1:SELECT=66,nova_cell1:UPDATE=20,nova_cell1:SAVEPOINT=5,nova_cell1:RELEASE=5,nova_cell1:INSERT=4 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 704.119628] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-d3143d92-ef49-429f-8d13-d8f40fd29702 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Took 0.13 seconds to select destinations for 1 instance(s). 
{{(pid=52331) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 704.136090] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d3143d92-ef49-429f-8d13-d8f40fd29702 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 704.136090] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d3143d92-ef49-429f-8d13-d8f40fd29702 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 704.136090] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d3143d92-ef49-429f-8d13-d8f40fd29702 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 704.165783] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d3143d92-ef49-429f-8d13-d8f40fd29702 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 704.165783] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None 
req-d3143d92-ef49-429f-8d13-d8f40fd29702 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 704.165783] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d3143d92-ef49-429f-8d13-d8f40fd29702 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 704.165783] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d3143d92-ef49-429f-8d13-d8f40fd29702 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 704.165783] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d3143d92-ef49-429f-8d13-d8f40fd29702 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 704.165783] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d3143d92-ef49-429f-8d13-d8f40fd29702 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock 
"59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 704.177171] nova-conductor[52331]: DEBUG nova.quota [None req-d3143d92-ef49-429f-8d13-d8f40fd29702 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Getting quotas for project f31d6cd1c9a045beacfc60af31d1ffad. Resources: {'instances', 'ram', 'cores'} {{(pid=52331) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 704.180030] nova-conductor[52331]: DEBUG nova.quota [None req-d3143d92-ef49-429f-8d13-d8f40fd29702 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Getting quotas for user 32d3f462daeb4beb923d97ca93470591 and project f31d6cd1c9a045beacfc60af31d1ffad. Resources: {'instances', 'ram', 'cores'} {{(pid=52331) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 704.186885] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-d3143d92-ef49-429f-8d13-d8f40fd29702 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] [instance: 50087e3e-214f-4235-b984-7c68143742ba] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52331) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 704.187490] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d3143d92-ef49-429f-8d13-d8f40fd29702 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 704.187797] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None 
req-d3143d92-ef49-429f-8d13-d8f40fd29702 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 704.188090] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d3143d92-ef49-429f-8d13-d8f40fd29702 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 704.191140] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-d3143d92-ef49-429f-8d13-d8f40fd29702 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] [instance: 50087e3e-214f-4235-b984-7c68143742ba] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 704.191982] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d3143d92-ef49-429f-8d13-d8f40fd29702 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Acquiring lock 
"59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 704.192313] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d3143d92-ef49-429f-8d13-d8f40fd29702 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 704.192599] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d3143d92-ef49-429f-8d13-d8f40fd29702 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 704.208520] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d3143d92-ef49-429f-8d13-d8f40fd29702 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 704.208735] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d3143d92-ef49-429f-8d13-d8f40fd29702 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 704.208907] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d3143d92-ef49-429f-8d13-d8f40fd29702 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 707.267815] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-af7eab99-c1e2-42cd-b159-94700ab41ef0 tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Took 0.13 seconds to select destinations for 1 instance(s). {{(pid=52331) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 707.279802] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-af7eab99-c1e2-42cd-b159-94700ab41ef0 tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 707.280040] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-af7eab99-c1e2-42cd-b159-94700ab41ef0 tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 707.280224] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-af7eab99-c1e2-42cd-b159-94700ab41ef0 tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Lock 
"59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 707.296402] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell1:SELECT=28,nova_cell1:SAVEPOINT=2,nova_cell1:INSERT=68,nova_cell1:RELEASE=2 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 707.316828] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-af7eab99-c1e2-42cd-b159-94700ab41ef0 tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 707.317071] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-af7eab99-c1e2-42cd-b159-94700ab41ef0 tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 707.317243] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-af7eab99-c1e2-42cd-b159-94700ab41ef0 tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 707.317600] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-af7eab99-c1e2-42cd-b159-94700ab41ef0 tempest-ServersTestMultiNic-1109782359 
tempest-ServersTestMultiNic-1109782359-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 707.317781] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-af7eab99-c1e2-42cd-b159-94700ab41ef0 tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 707.317940] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-af7eab99-c1e2-42cd-b159-94700ab41ef0 tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 707.326335] nova-conductor[52331]: DEBUG nova.quota [None req-af7eab99-c1e2-42cd-b159-94700ab41ef0 tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Getting quotas for project 522cbd160e594fe2b192ce3aeaa6c682. Resources: {'instances', 'ram', 'cores'} {{(pid=52331) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 707.328640] nova-conductor[52331]: DEBUG nova.quota [None req-af7eab99-c1e2-42cd-b159-94700ab41ef0 tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Getting quotas for user b93e03caf34f46a488174c088efa722f and project 522cbd160e594fe2b192ce3aeaa6c682. 
Resources: {'instances', 'ram', 'cores'} {{(pid=52331) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 707.334600] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-af7eab99-c1e2-42cd-b159-94700ab41ef0 tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] [instance: 73b2e13d-9434-48f2-8ca5-93336112814f] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52331) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 707.335727] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-af7eab99-c1e2-42cd-b159-94700ab41ef0 tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 707.335727] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-af7eab99-c1e2-42cd-b159-94700ab41ef0 tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 707.335727] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-af7eab99-c1e2-42cd-b159-94700ab41ef0 tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 707.338623] nova-conductor[52331]: DEBUG nova.conductor.manager [None 
req-af7eab99-c1e2-42cd-b159-94700ab41ef0 tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] [instance: 73b2e13d-9434-48f2-8ca5-93336112814f] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 707.339155] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-af7eab99-c1e2-42cd-b159-94700ab41ef0 tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 707.339356] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-af7eab99-c1e2-42cd-b159-94700ab41ef0 tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 707.339518] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-af7eab99-c1e2-42cd-b159-94700ab41ef0 tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 707.353574] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-af7eab99-c1e2-42cd-b159-94700ab41ef0 tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 707.354179] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-af7eab99-c1e2-42cd-b159-94700ab41ef0 tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 707.354482] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-af7eab99-c1e2-42cd-b159-94700ab41ef0 tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 708.465357] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:SELECT=73,nova_cell1:UPDATE=21,nova_cell1:INSERT=4,nova_cell1:SAVEPOINT=1,nova_cell1:RELEASE=1 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 709.181697] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell1:SELECT=77,nova_cell1:UPDATE=19,nova_cell1:SAVEPOINT=2,nova_cell1:RELEASE=2 {{(pid=52331) 
stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 709.287002] nova-conductor[52331]: ERROR nova.scheduler.utils [None req-7cb9a9c9-6bae-43a7-9f29-4cd28d8d77dd tempest-SecurityGroupsTestJSON-1291402501 tempest-SecurityGroupsTestJSON-1291402501-project-member] [instance: d77dcb9a-4f92-4dca-9182-cf8a5676cfd5] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 537, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in 
context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 4d1e45c4-185f-4fce-94c9-c4e6f30271ac, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance d77dcb9a-4f92-4dca-9182-cf8a5676cfd5 was re-scheduled: Binding failed for port 4d1e45c4-185f-4fce-94c9-c4e6f30271ac, please check neutron logs for more 
information.\n'] [ 709.287883] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-7cb9a9c9-6bae-43a7-9f29-4cd28d8d77dd tempest-SecurityGroupsTestJSON-1291402501 tempest-SecurityGroupsTestJSON-1291402501-project-member] Rescheduling: True {{(pid=52331) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 709.288129] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-7cb9a9c9-6bae-43a7-9f29-4cd28d8d77dd tempest-SecurityGroupsTestJSON-1291402501 tempest-SecurityGroupsTestJSON-1291402501-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance d77dcb9a-4f92-4dca-9182-cf8a5676cfd5.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance d77dcb9a-4f92-4dca-9182-cf8a5676cfd5. [ 709.288541] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-7cb9a9c9-6bae-43a7-9f29-4cd28d8d77dd tempest-SecurityGroupsTestJSON-1291402501 tempest-SecurityGroupsTestJSON-1291402501-project-member] [instance: d77dcb9a-4f92-4dca-9182-cf8a5676cfd5] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance d77dcb9a-4f92-4dca-9182-cf8a5676cfd5. 
[ 709.316151] nova-conductor[52331]: DEBUG nova.network.neutron [None req-7cb9a9c9-6bae-43a7-9f29-4cd28d8d77dd tempest-SecurityGroupsTestJSON-1291402501 tempest-SecurityGroupsTestJSON-1291402501-project-member] [instance: d77dcb9a-4f92-4dca-9182-cf8a5676cfd5] deallocate_for_instance() {{(pid=52331) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}}
[ 709.498205] nova-conductor[52331]: DEBUG nova.network.neutron [None req-7cb9a9c9-6bae-43a7-9f29-4cd28d8d77dd tempest-SecurityGroupsTestJSON-1291402501 tempest-SecurityGroupsTestJSON-1291402501-project-member] [instance: d77dcb9a-4f92-4dca-9182-cf8a5676cfd5] Instance cache missing network info. {{(pid=52331) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}}
[ 709.505397] nova-conductor[52331]: DEBUG nova.network.neutron [None req-7cb9a9c9-6bae-43a7-9f29-4cd28d8d77dd tempest-SecurityGroupsTestJSON-1291402501 tempest-SecurityGroupsTestJSON-1291402501-project-member] [instance: d77dcb9a-4f92-4dca-9182-cf8a5676cfd5] Updating instance_info_cache with network_info: [] {{(pid=52331) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 710.030249] nova-conductor[52332]: ERROR nova.scheduler.utils [None req-4e124a28-1cf4-4e90-bf63-969df95c76e0 tempest-ServersAdmin275Test-635699095 tempest-ServersAdmin275Test-635699095-project-member] [instance: 16ad02eb-ec64-475a-baeb-f995578b154d] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 537, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance 16ad02eb-ec64-475a-baeb-f995578b154d was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"]
[ 710.032265] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-4e124a28-1cf4-4e90-bf63-969df95c76e0 tempest-ServersAdmin275Test-635699095 tempest-ServersAdmin275Test-635699095-project-member] Rescheduling: True {{(pid=52332) build_instances /opt/stack/nova/nova/conductor/manager.py:695}}
[ 710.032932] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-4e124a28-1cf4-4e90-bf63-969df95c76e0 tempest-ServersAdmin275Test-635699095 tempest-ServersAdmin275Test-635699095-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 16ad02eb-ec64-475a-baeb-f995578b154d.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 16ad02eb-ec64-475a-baeb-f995578b154d.
[ 710.039638] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-4e124a28-1cf4-4e90-bf63-969df95c76e0 tempest-ServersAdmin275Test-635699095 tempest-ServersAdmin275Test-635699095-project-member] [instance: 16ad02eb-ec64-475a-baeb-f995578b154d] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 16ad02eb-ec64-475a-baeb-f995578b154d.
[ 710.133533] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell0:INSERT=17,nova_cell0:SELECT=24,nova_cell0:UPDATE=4 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 710.153614] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:INSERT=35,nova_cell1:SELECT=15,nova_cell1:SAVEPOINT=1,nova_cell1:RELEASE=1 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 710.212770] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-c9f485df-ffcb-44dc-816c-0fb4ddf9a849 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Took 0.13 seconds to select destinations for 1 instance(s). {{(pid=52332) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}}
[ 710.229406] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-c9f485df-ffcb-44dc-816c-0fb4ddf9a849 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 710.229406] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-c9f485df-ffcb-44dc-816c-0fb4ddf9a849 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 710.231859] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-c9f485df-ffcb-44dc-816c-0fb4ddf9a849 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 710.272803] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-c9f485df-ffcb-44dc-816c-0fb4ddf9a849 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 710.273116] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-c9f485df-ffcb-44dc-816c-0fb4ddf9a849 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 710.273474] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-c9f485df-ffcb-44dc-816c-0fb4ddf9a849 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 710.273904] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-c9f485df-ffcb-44dc-816c-0fb4ddf9a849 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 710.274460] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-c9f485df-ffcb-44dc-816c-0fb4ddf9a849 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 710.274460] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-c9f485df-ffcb-44dc-816c-0fb4ddf9a849 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 710.283501] nova-conductor[52332]: DEBUG nova.quota [None req-c9f485df-ffcb-44dc-816c-0fb4ddf9a849 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Getting quotas for project 858f3218a3de48b89a6ec9f62c0ff7fb. Resources: {'instances', 'ram', 'cores'} {{(pid=52332) _get_quotas /opt/stack/nova/nova/quota.py:393}}
[ 710.286312] nova-conductor[52332]: DEBUG nova.quota [None req-c9f485df-ffcb-44dc-816c-0fb4ddf9a849 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Getting quotas for user ddd2a269170b464cabb5168a2d6a2ebd and project 858f3218a3de48b89a6ec9f62c0ff7fb. Resources: {'instances', 'ram', 'cores'} {{(pid=52332) _get_quotas /opt/stack/nova/nova/quota.py:383}}
[ 710.293562] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-c9f485df-ffcb-44dc-816c-0fb4ddf9a849 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] [instance: 8366297c-0710-4598-88ab-670de3bfc8ab] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52332) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}}
[ 710.294075] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-c9f485df-ffcb-44dc-816c-0fb4ddf9a849 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 710.294378] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-c9f485df-ffcb-44dc-816c-0fb4ddf9a849 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 710.294577] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-c9f485df-ffcb-44dc-816c-0fb4ddf9a849 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 710.297475] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-c9f485df-ffcb-44dc-816c-0fb4ddf9a849 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] [instance: 8366297c-0710-4598-88ab-670de3bfc8ab] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 710.309443] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-c9f485df-ffcb-44dc-816c-0fb4ddf9a849 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 710.309721] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-c9f485df-ffcb-44dc-816c-0fb4ddf9a849 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 710.309831] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-c9f485df-ffcb-44dc-816c-0fb4ddf9a849 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 710.325228] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-c9f485df-ffcb-44dc-816c-0fb4ddf9a849 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 710.325598] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-c9f485df-ffcb-44dc-816c-0fb4ddf9a849 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 710.325682] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-c9f485df-ffcb-44dc-816c-0fb4ddf9a849 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 710.995495] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:SELECT=71,nova_cell1:SAVEPOINT=4,nova_cell1:RELEASE=4,nova_cell1:UPDATE=19,nova_cell1:INSERT=2 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 711.410874] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell1:UPDATE=20,nova_cell1:SELECT=66,nova_cell1:INSERT=4,nova_cell1:SAVEPOINT=5,nova_cell1:RELEASE=5 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 711.495268] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-697143ca-abf2-4054-9cfa-8733285ecaef tempest-ServerPasswordTestJSON-798403229 tempest-ServerPasswordTestJSON-798403229-project-member] Took 0.16 seconds to select destinations for 1 instance(s). {{(pid=52332) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}}
[ 711.509874] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-697143ca-abf2-4054-9cfa-8733285ecaef tempest-ServerPasswordTestJSON-798403229 tempest-ServerPasswordTestJSON-798403229-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 711.509874] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-697143ca-abf2-4054-9cfa-8733285ecaef tempest-ServerPasswordTestJSON-798403229 tempest-ServerPasswordTestJSON-798403229-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 711.509874] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-697143ca-abf2-4054-9cfa-8733285ecaef tempest-ServerPasswordTestJSON-798403229 tempest-ServerPasswordTestJSON-798403229-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 711.540254] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-697143ca-abf2-4054-9cfa-8733285ecaef tempest-ServerPasswordTestJSON-798403229 tempest-ServerPasswordTestJSON-798403229-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 711.540517] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-697143ca-abf2-4054-9cfa-8733285ecaef tempest-ServerPasswordTestJSON-798403229 tempest-ServerPasswordTestJSON-798403229-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.002s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 711.540690] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-697143ca-abf2-4054-9cfa-8733285ecaef tempest-ServerPasswordTestJSON-798403229 tempest-ServerPasswordTestJSON-798403229-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 711.541063] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-697143ca-abf2-4054-9cfa-8733285ecaef tempest-ServerPasswordTestJSON-798403229 tempest-ServerPasswordTestJSON-798403229-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 711.541248] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-697143ca-abf2-4054-9cfa-8733285ecaef tempest-ServerPasswordTestJSON-798403229 tempest-ServerPasswordTestJSON-798403229-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 711.541407] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-697143ca-abf2-4054-9cfa-8733285ecaef tempest-ServerPasswordTestJSON-798403229 tempest-ServerPasswordTestJSON-798403229-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 711.552477] nova-conductor[52332]: DEBUG nova.quota [None req-697143ca-abf2-4054-9cfa-8733285ecaef tempest-ServerPasswordTestJSON-798403229 tempest-ServerPasswordTestJSON-798403229-project-member] Getting quotas for project af011e99f4dc43bba2d1aab240404cad. Resources: {'instances', 'ram', 'cores'} {{(pid=52332) _get_quotas /opt/stack/nova/nova/quota.py:393}}
[ 711.557020] nova-conductor[52332]: DEBUG nova.quota [None req-697143ca-abf2-4054-9cfa-8733285ecaef tempest-ServerPasswordTestJSON-798403229 tempest-ServerPasswordTestJSON-798403229-project-member] Getting quotas for user 019ec2740fb84bd2abb67020e018d221 and project af011e99f4dc43bba2d1aab240404cad. Resources: {'instances', 'ram', 'cores'} {{(pid=52332) _get_quotas /opt/stack/nova/nova/quota.py:383}}
[ 711.562023] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-697143ca-abf2-4054-9cfa-8733285ecaef tempest-ServerPasswordTestJSON-798403229 tempest-ServerPasswordTestJSON-798403229-project-member] [instance: c97aeba9-a651-449c-8b7f-22bad95800b6] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52332) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}}
[ 711.562598] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-697143ca-abf2-4054-9cfa-8733285ecaef tempest-ServerPasswordTestJSON-798403229 tempest-ServerPasswordTestJSON-798403229-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 711.562986] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-697143ca-abf2-4054-9cfa-8733285ecaef tempest-ServerPasswordTestJSON-798403229 tempest-ServerPasswordTestJSON-798403229-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 711.563209] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-697143ca-abf2-4054-9cfa-8733285ecaef tempest-ServerPasswordTestJSON-798403229 tempest-ServerPasswordTestJSON-798403229-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 711.566422] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-697143ca-abf2-4054-9cfa-8733285ecaef tempest-ServerPasswordTestJSON-798403229 tempest-ServerPasswordTestJSON-798403229-project-member] [instance: c97aeba9-a651-449c-8b7f-22bad95800b6] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 711.567127] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-697143ca-abf2-4054-9cfa-8733285ecaef tempest-ServerPasswordTestJSON-798403229 tempest-ServerPasswordTestJSON-798403229-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 711.567366] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-697143ca-abf2-4054-9cfa-8733285ecaef tempest-ServerPasswordTestJSON-798403229 tempest-ServerPasswordTestJSON-798403229-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 711.567561] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-697143ca-abf2-4054-9cfa-8733285ecaef tempest-ServerPasswordTestJSON-798403229 tempest-ServerPasswordTestJSON-798403229-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 711.581043] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-697143ca-abf2-4054-9cfa-8733285ecaef tempest-ServerPasswordTestJSON-798403229 tempest-ServerPasswordTestJSON-798403229-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 711.581315] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-697143ca-abf2-4054-9cfa-8733285ecaef tempest-ServerPasswordTestJSON-798403229 tempest-ServerPasswordTestJSON-798403229-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 711.581519] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-697143ca-abf2-4054-9cfa-8733285ecaef tempest-ServerPasswordTestJSON-798403229 tempest-ServerPasswordTestJSON-798403229-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 711.979552] nova-conductor[52332]: ERROR nova.scheduler.utils [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] [instance: bc10b0e6-e56e-4c07-87aa-a552c4c5ce10] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 537, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 87a0a56c-dd7e-48e4-967e-d471997aa72b, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance bc10b0e6-e56e-4c07-87aa-a552c4c5ce10 was re-scheduled: Binding failed for port 87a0a56c-dd7e-48e4-967e-d471997aa72b, please check neutron logs for more information.\n']
[ 711.980174] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Rescheduling: True {{(pid=52332) build_instances /opt/stack/nova/nova/conductor/manager.py:695}}
[ 711.980400] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance bc10b0e6-e56e-4c07-87aa-a552c4c5ce10.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance bc10b0e6-e56e-4c07-87aa-a552c4c5ce10.
[ 711.980820] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] [instance: bc10b0e6-e56e-4c07-87aa-a552c4c5ce10] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance bc10b0e6-e56e-4c07-87aa-a552c4c5ce10.
[ 712.016325] nova-conductor[52332]: DEBUG nova.network.neutron [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] [instance: bc10b0e6-e56e-4c07-87aa-a552c4c5ce10] deallocate_for_instance() {{(pid=52332) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}}
[ 712.192032] nova-conductor[52332]: DEBUG nova.network.neutron [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] [instance: bc10b0e6-e56e-4c07-87aa-a552c4c5ce10] Instance cache missing network info. {{(pid=52332) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}}
[ 712.192763] nova-conductor[52332]: DEBUG nova.network.neutron [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] [instance: bc10b0e6-e56e-4c07-87aa-a552c4c5ce10] Updating instance_info_cache with network_info: [] {{(pid=52332) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 712.429292] nova-conductor[52331]: ERROR nova.scheduler.utils [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] [instance: 1a83a3c2-0f15-4d6d-ba2e-921485ade456] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 537, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 40cad3d5-0d84-4c7e-95d7-33f79db569f1, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 1a83a3c2-0f15-4d6d-ba2e-921485ade456 was re-scheduled: Binding failed for port 40cad3d5-0d84-4c7e-95d7-33f79db569f1, please check neutron logs for more information.\n']
[ 712.429931] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Rescheduling: True {{(pid=52331) build_instances /opt/stack/nova/nova/conductor/manager.py:695}}
[ 712.430174] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 1a83a3c2-0f15-4d6d-ba2e-921485ade456.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 1a83a3c2-0f15-4d6d-ba2e-921485ade456.
[ 712.430376] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] [instance: 1a83a3c2-0f15-4d6d-ba2e-921485ade456] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 1a83a3c2-0f15-4d6d-ba2e-921485ade456.
[ 712.454193] nova-conductor[52331]: DEBUG nova.network.neutron [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] [instance: 1a83a3c2-0f15-4d6d-ba2e-921485ade456] deallocate_for_instance() {{(pid=52331) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}}
[ 712.538537] nova-conductor[52331]: DEBUG nova.network.neutron [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] [instance: 1a83a3c2-0f15-4d6d-ba2e-921485ade456] Instance cache missing network info.
{{(pid=52331) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 712.546495] nova-conductor[52331]: DEBUG nova.network.neutron [None req-6e5f8f04-87f4-4ccc-ae2d-ec67339e86cb tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] [instance: 1a83a3c2-0f15-4d6d-ba2e-921485ade456] Updating instance_info_cache with network_info: [] {{(pid=52331) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 714.026028] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-6ecdeff4-34f3-4c67-a943-88458d1bf135 tempest-ImagesOneServerTestJSON-1677098362 tempest-ImagesOneServerTestJSON-1677098362-project-member] Took 0.13 seconds to select destinations for 1 instance(s). {{(pid=52331) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 714.037204] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6ecdeff4-34f3-4c67-a943-88458d1bf135 tempest-ImagesOneServerTestJSON-1677098362 tempest-ImagesOneServerTestJSON-1677098362-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 714.040370] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6ecdeff4-34f3-4c67-a943-88458d1bf135 tempest-ImagesOneServerTestJSON-1677098362 tempest-ImagesOneServerTestJSON-1677098362-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.003s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 714.040572] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6ecdeff4-34f3-4c67-a943-88458d1bf135 tempest-ImagesOneServerTestJSON-1677098362 tempest-ImagesOneServerTestJSON-1677098362-project-member] Lock 
"59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 714.077018] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6ecdeff4-34f3-4c67-a943-88458d1bf135 tempest-ImagesOneServerTestJSON-1677098362 tempest-ImagesOneServerTestJSON-1677098362-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 714.077245] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6ecdeff4-34f3-4c67-a943-88458d1bf135 tempest-ImagesOneServerTestJSON-1677098362 tempest-ImagesOneServerTestJSON-1677098362-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 714.078564] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6ecdeff4-34f3-4c67-a943-88458d1bf135 tempest-ImagesOneServerTestJSON-1677098362 tempest-ImagesOneServerTestJSON-1677098362-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 714.078564] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6ecdeff4-34f3-4c67-a943-88458d1bf135 tempest-ImagesOneServerTestJSON-1677098362 tempest-ImagesOneServerTestJSON-1677098362-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 714.078564] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6ecdeff4-34f3-4c67-a943-88458d1bf135 tempest-ImagesOneServerTestJSON-1677098362 tempest-ImagesOneServerTestJSON-1677098362-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 714.078564] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6ecdeff4-34f3-4c67-a943-88458d1bf135 tempest-ImagesOneServerTestJSON-1677098362 tempest-ImagesOneServerTestJSON-1677098362-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 714.092449] nova-conductor[52331]: DEBUG nova.quota [None req-6ecdeff4-34f3-4c67-a943-88458d1bf135 tempest-ImagesOneServerTestJSON-1677098362 tempest-ImagesOneServerTestJSON-1677098362-project-member] Getting quotas for project d3230ca030e34bff8fb3ae259eb9efa5. Resources: {'instances', 'ram', 'cores'} {{(pid=52331) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 714.095036] nova-conductor[52331]: DEBUG nova.quota [None req-6ecdeff4-34f3-4c67-a943-88458d1bf135 tempest-ImagesOneServerTestJSON-1677098362 tempest-ImagesOneServerTestJSON-1677098362-project-member] Getting quotas for user 5ab19533eaa6431c8475cf8c3f31c7e1 and project d3230ca030e34bff8fb3ae259eb9efa5. 
Resources: {'instances', 'ram', 'cores'} {{(pid=52331) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 714.103757] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-6ecdeff4-34f3-4c67-a943-88458d1bf135 tempest-ImagesOneServerTestJSON-1677098362 tempest-ImagesOneServerTestJSON-1677098362-project-member] [instance: d68d2759-8d2b-4b88-b005-0afe4beb7792] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52331) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 714.104441] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6ecdeff4-34f3-4c67-a943-88458d1bf135 tempest-ImagesOneServerTestJSON-1677098362 tempest-ImagesOneServerTestJSON-1677098362-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 714.104829] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6ecdeff4-34f3-4c67-a943-88458d1bf135 tempest-ImagesOneServerTestJSON-1677098362 tempest-ImagesOneServerTestJSON-1677098362-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 714.105074] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6ecdeff4-34f3-4c67-a943-88458d1bf135 tempest-ImagesOneServerTestJSON-1677098362 tempest-ImagesOneServerTestJSON-1677098362-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 714.107971] nova-conductor[52331]: DEBUG nova.conductor.manager 
[None req-6ecdeff4-34f3-4c67-a943-88458d1bf135 tempest-ImagesOneServerTestJSON-1677098362 tempest-ImagesOneServerTestJSON-1677098362-project-member] [instance: d68d2759-8d2b-4b88-b005-0afe4beb7792] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 714.109127] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6ecdeff4-34f3-4c67-a943-88458d1bf135 tempest-ImagesOneServerTestJSON-1677098362 tempest-ImagesOneServerTestJSON-1677098362-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 714.109127] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6ecdeff4-34f3-4c67-a943-88458d1bf135 tempest-ImagesOneServerTestJSON-1677098362 tempest-ImagesOneServerTestJSON-1677098362-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 714.109258] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6ecdeff4-34f3-4c67-a943-88458d1bf135 tempest-ImagesOneServerTestJSON-1677098362 tempest-ImagesOneServerTestJSON-1677098362-project-member] Lock 
"59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 714.125108] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6ecdeff4-34f3-4c67-a943-88458d1bf135 tempest-ImagesOneServerTestJSON-1677098362 tempest-ImagesOneServerTestJSON-1677098362-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 714.125336] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6ecdeff4-34f3-4c67-a943-88458d1bf135 tempest-ImagesOneServerTestJSON-1677098362 tempest-ImagesOneServerTestJSON-1677098362-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 714.125568] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6ecdeff4-34f3-4c67-a943-88458d1bf135 tempest-ImagesOneServerTestJSON-1677098362 tempest-ImagesOneServerTestJSON-1677098362-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 714.193021] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:SELECT=70,nova_cell1:SAVEPOINT=5,nova_cell1:RELEASE=5,nova_cell1:UPDATE=17,nova_cell1:INSERT=3 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 714.241101] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats 
nova_cell1:SELECT=69,nova_cell1:UPDATE=23,nova_cell1:INSERT=4,nova_cell1:SAVEPOINT=2,nova_cell1:RELEASE=2 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 715.792979] nova-conductor[52331]: ERROR nova.scheduler.utils [None req-d3143d92-ef49-429f-8d13-d8f40fd29702 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] [instance: 50087e3e-214f-4235-b984-7c68143742ba] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 537, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File 
"/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port d8e32bdd-4ae9-46f2-ba16-b3153129efe4, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 
'nova.exception.RescheduledException: Build of instance 50087e3e-214f-4235-b984-7c68143742ba was re-scheduled: Binding failed for port d8e32bdd-4ae9-46f2-ba16-b3153129efe4, please check neutron logs for more information.\n'] [ 715.794210] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-d3143d92-ef49-429f-8d13-d8f40fd29702 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Rescheduling: True {{(pid=52331) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 715.794553] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-d3143d92-ef49-429f-8d13-d8f40fd29702 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 50087e3e-214f-4235-b984-7c68143742ba.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 50087e3e-214f-4235-b984-7c68143742ba. [ 715.795017] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-d3143d92-ef49-429f-8d13-d8f40fd29702 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] [instance: 50087e3e-214f-4235-b984-7c68143742ba] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 50087e3e-214f-4235-b984-7c68143742ba. 
[ 715.819495] nova-conductor[52331]: DEBUG nova.network.neutron [None req-d3143d92-ef49-429f-8d13-d8f40fd29702 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] [instance: 50087e3e-214f-4235-b984-7c68143742ba] deallocate_for_instance() {{(pid=52331) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 715.871432] nova-conductor[52331]: DEBUG nova.network.neutron [None req-d3143d92-ef49-429f-8d13-d8f40fd29702 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] [instance: 50087e3e-214f-4235-b984-7c68143742ba] Instance cache missing network info. {{(pid=52331) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 715.877224] nova-conductor[52331]: DEBUG nova.network.neutron [None req-d3143d92-ef49-429f-8d13-d8f40fd29702 tempest-DeleteServersTestJSON-1350873409 tempest-DeleteServersTestJSON-1350873409-project-member] [instance: 50087e3e-214f-4235-b984-7c68143742ba] Updating instance_info_cache with network_info: [] {{(pid=52331) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 719.680448] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell1:SELECT=66,nova_cell1:UPDATE=17,nova_cell1:INSERT=3,nova_cell1:SAVEPOINT=7,nova_cell1:RELEASE=7 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 720.141679] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:SELECT=71,nova_cell1:UPDATE=15,nova_cell1:SAVEPOINT=7,nova_cell1:RELEASE=7 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 720.325964] nova-conductor[52331]: ERROR nova.scheduler.utils [None req-c9f485df-ffcb-44dc-816c-0fb4ddf9a849 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] [instance: 8366297c-0710-4598-88ab-670de3bfc8ab] Error from last host: cpu-1 (node 
domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 537, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in 
allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 7f998a9e-0e7a-4708-8c55-86ec46c9877a, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 8366297c-0710-4598-88ab-670de3bfc8ab was re-scheduled: Binding failed for port 7f998a9e-0e7a-4708-8c55-86ec46c9877a, please check neutron logs for more information.\n'] [ 720.326548] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-c9f485df-ffcb-44dc-816c-0fb4ddf9a849 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Rescheduling: True {{(pid=52331) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 720.326777] nova-conductor[52331]: 
WARNING nova.scheduler.utils [None req-c9f485df-ffcb-44dc-816c-0fb4ddf9a849 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 8366297c-0710-4598-88ab-670de3bfc8ab.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 8366297c-0710-4598-88ab-670de3bfc8ab. [ 720.326990] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-c9f485df-ffcb-44dc-816c-0fb4ddf9a849 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] [instance: 8366297c-0710-4598-88ab-670de3bfc8ab] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 8366297c-0710-4598-88ab-670de3bfc8ab. [ 720.362040] nova-conductor[52331]: DEBUG nova.network.neutron [None req-c9f485df-ffcb-44dc-816c-0fb4ddf9a849 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] [instance: 8366297c-0710-4598-88ab-670de3bfc8ab] deallocate_for_instance() {{(pid=52331) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 720.506916] nova-conductor[52331]: DEBUG nova.network.neutron [None req-c9f485df-ffcb-44dc-816c-0fb4ddf9a849 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] [instance: 8366297c-0710-4598-88ab-670de3bfc8ab] Instance cache missing network info. 
{{(pid=52331) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 720.521996] nova-conductor[52331]: DEBUG nova.network.neutron [None req-c9f485df-ffcb-44dc-816c-0fb4ddf9a849 tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] [instance: 8366297c-0710-4598-88ab-670de3bfc8ab] Updating instance_info_cache with network_info: [] {{(pid=52331) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 720.721348] nova-conductor[52331]: ERROR nova.scheduler.utils [None req-af7eab99-c1e2-42cd-b159-94700ab41ef0 tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] [instance: 73b2e13d-9434-48f2-8ca5-93336112814f] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 537, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File 
"/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port cb303948-c8b4-49ec-bff0-2425c1455ef5, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in 
_do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 73b2e13d-9434-48f2-8ca5-93336112814f was re-scheduled: Binding failed for port cb303948-c8b4-49ec-bff0-2425c1455ef5, please check neutron logs for more information.\n'] [ 720.722239] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-af7eab99-c1e2-42cd-b159-94700ab41ef0 tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Rescheduling: True {{(pid=52331) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 720.722465] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-af7eab99-c1e2-42cd-b159-94700ab41ef0 tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 73b2e13d-9434-48f2-8ca5-93336112814f.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 73b2e13d-9434-48f2-8ca5-93336112814f. [ 720.722671] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-af7eab99-c1e2-42cd-b159-94700ab41ef0 tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] [instance: 73b2e13d-9434-48f2-8ca5-93336112814f] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 73b2e13d-9434-48f2-8ca5-93336112814f. 
[ 720.743478] nova-conductor[52331]: DEBUG nova.network.neutron [None req-af7eab99-c1e2-42cd-b159-94700ab41ef0 tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] [instance: 73b2e13d-9434-48f2-8ca5-93336112814f] deallocate_for_instance() {{(pid=52331) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 720.873144] nova-conductor[52331]: DEBUG nova.network.neutron [None req-af7eab99-c1e2-42cd-b159-94700ab41ef0 tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] [instance: 73b2e13d-9434-48f2-8ca5-93336112814f] Instance cache missing network info. {{(pid=52331) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 720.878899] nova-conductor[52331]: DEBUG nova.network.neutron [None req-af7eab99-c1e2-42cd-b159-94700ab41ef0 tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] [instance: 73b2e13d-9434-48f2-8ca5-93336112814f] Updating instance_info_cache with network_info: [] {{(pid=52331) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 721.132651] nova-conductor[52331]: ERROR nova.scheduler.utils [None req-697143ca-abf2-4054-9cfa-8733285ecaef tempest-ServerPasswordTestJSON-798403229 tempest-ServerPasswordTestJSON-798403229-project-member] [instance: c97aeba9-a651-449c-8b7f-22bad95800b6] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 537, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 
275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = 
self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port fefae542-67ee-4463-ac50-8d92f3d00fcb, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance c97aeba9-a651-449c-8b7f-22bad95800b6 was re-scheduled: Binding failed for port fefae542-67ee-4463-ac50-8d92f3d00fcb, please check neutron logs for more information.\n'] [ 721.133103] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-697143ca-abf2-4054-9cfa-8733285ecaef tempest-ServerPasswordTestJSON-798403229 tempest-ServerPasswordTestJSON-798403229-project-member] Rescheduling: True {{(pid=52331) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 721.133333] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-697143ca-abf2-4054-9cfa-8733285ecaef tempest-ServerPasswordTestJSON-798403229 tempest-ServerPasswordTestJSON-798403229-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance c97aeba9-a651-449c-8b7f-22bad95800b6.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance c97aeba9-a651-449c-8b7f-22bad95800b6. 
[ 721.133540] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-697143ca-abf2-4054-9cfa-8733285ecaef tempest-ServerPasswordTestJSON-798403229 tempest-ServerPasswordTestJSON-798403229-project-member] [instance: c97aeba9-a651-449c-8b7f-22bad95800b6] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance c97aeba9-a651-449c-8b7f-22bad95800b6. [ 721.152851] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell1:SELECT=63,nova_cell1:UPDATE=19,nova_cell1:SAVEPOINT=7,nova_cell1:RELEASE=7,nova_cell1:INSERT=4 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 721.158358] nova-conductor[52331]: DEBUG nova.network.neutron [None req-697143ca-abf2-4054-9cfa-8733285ecaef tempest-ServerPasswordTestJSON-798403229 tempest-ServerPasswordTestJSON-798403229-project-member] [instance: c97aeba9-a651-449c-8b7f-22bad95800b6] deallocate_for_instance() {{(pid=52331) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 721.214273] nova-conductor[52331]: DEBUG nova.network.neutron [None req-697143ca-abf2-4054-9cfa-8733285ecaef tempest-ServerPasswordTestJSON-798403229 tempest-ServerPasswordTestJSON-798403229-project-member] [instance: c97aeba9-a651-449c-8b7f-22bad95800b6] Instance cache missing network info. 
{{(pid=52331) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 721.226197] nova-conductor[52331]: DEBUG nova.network.neutron [None req-697143ca-abf2-4054-9cfa-8733285ecaef tempest-ServerPasswordTestJSON-798403229 tempest-ServerPasswordTestJSON-798403229-project-member] [instance: c97aeba9-a651-449c-8b7f-22bad95800b6] Updating instance_info_cache with network_info: [] {{(pid=52331) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 721.314917] nova-conductor[52332]: ERROR nova.scheduler.utils [None req-6ecdeff4-34f3-4c67-a943-88458d1bf135 tempest-ImagesOneServerTestJSON-1677098362 tempest-ImagesOneServerTestJSON-1677098362-project-member] [instance: d68d2759-8d2b-4b88-b005-0afe4beb7792] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 537, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File 
"/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 4bd836ba-91d0-4f5a-b971-168588a68707, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in 
_do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance d68d2759-8d2b-4b88-b005-0afe4beb7792 was re-scheduled: Binding failed for port 4bd836ba-91d0-4f5a-b971-168588a68707, please check neutron logs for more information.\n'] [ 721.315879] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-6ecdeff4-34f3-4c67-a943-88458d1bf135 tempest-ImagesOneServerTestJSON-1677098362 tempest-ImagesOneServerTestJSON-1677098362-project-member] Rescheduling: True {{(pid=52332) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 721.316349] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-6ecdeff4-34f3-4c67-a943-88458d1bf135 tempest-ImagesOneServerTestJSON-1677098362 tempest-ImagesOneServerTestJSON-1677098362-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance d68d2759-8d2b-4b88-b005-0afe4beb7792.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance d68d2759-8d2b-4b88-b005-0afe4beb7792. [ 721.316765] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-6ecdeff4-34f3-4c67-a943-88458d1bf135 tempest-ImagesOneServerTestJSON-1677098362 tempest-ImagesOneServerTestJSON-1677098362-project-member] [instance: d68d2759-8d2b-4b88-b005-0afe4beb7792] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance d68d2759-8d2b-4b88-b005-0afe4beb7792. 
[ 721.374489] nova-conductor[52332]: DEBUG nova.network.neutron [None req-6ecdeff4-34f3-4c67-a943-88458d1bf135 tempest-ImagesOneServerTestJSON-1677098362 tempest-ImagesOneServerTestJSON-1677098362-project-member] [instance: d68d2759-8d2b-4b88-b005-0afe4beb7792] deallocate_for_instance() {{(pid=52332) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 721.447573] nova-conductor[52332]: DEBUG nova.network.neutron [None req-6ecdeff4-34f3-4c67-a943-88458d1bf135 tempest-ImagesOneServerTestJSON-1677098362 tempest-ImagesOneServerTestJSON-1677098362-project-member] [instance: d68d2759-8d2b-4b88-b005-0afe4beb7792] Instance cache missing network info. {{(pid=52332) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 721.452785] nova-conductor[52332]: DEBUG nova.network.neutron [None req-6ecdeff4-34f3-4c67-a943-88458d1bf135 tempest-ImagesOneServerTestJSON-1677098362 tempest-ImagesOneServerTestJSON-1677098362-project-member] [instance: d68d2759-8d2b-4b88-b005-0afe4beb7792] Updating instance_info_cache with network_info: [] {{(pid=52332) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 721.486049] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Took 0.21 seconds to select destinations for 2 instance(s). 
{{(pid=52331) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 721.498721] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 721.498982] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 721.499125] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 721.534947] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 721.535180] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None 
req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 721.535350] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 721.546487] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell0:SELECT=4 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 721.562483] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 721.562906] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 721.563232] nova-conductor[52331]: DEBUG 
oslo_concurrency.lockutils [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 721.563733] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 721.564103] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 721.564415] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 721.569900] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:SELECT=18,nova_cell1:INSERT=39,nova_cell1:SAVEPOINT=1,nova_cell1:RELEASE=1 {{(pid=52332) stat_writer 
/usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 721.573244] nova-conductor[52331]: DEBUG nova.quota [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Getting quotas for project 3855ad934a0e461b821e48fd1680bcb9. Resources: {'instances', 'ram', 'cores'} {{(pid=52331) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 721.575340] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_api:SELECT=85,nova_api:UPDATE=5,nova_api:DELETE=3 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 721.575901] nova-conductor[52331]: DEBUG nova.quota [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Getting quotas for user 978b666d27814afabf1e2b71169d4b26 and project 3855ad934a0e461b821e48fd1680bcb9. Resources: {'instances', 'ram', 'cores'} {{(pid=52331) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 721.579937] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_api:SELECT=81,nova_api:UPDATE=2,nova_api:DELETE=4 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 721.581958] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] [instance: a5880165-fe4c-4453-b045-2e4968d40f55] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52331) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 721.582547] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 721.582890] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 721.583194] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 721.586365] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] [instance: a5880165-fe4c-4453-b045-2e4968d40f55] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 721.587452] 
nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 721.587847] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 721.588239] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 721.604377] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 721.604767] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 
tempest-MultipleCreateTestJSON-371914168-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 721.605103] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 721.612066] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] [instance: e3009964-caae-4594-b2a0-7eff079abcb2] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52331) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 721.612530] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 721.612728] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 721.612893] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 721.616222] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] [instance: e3009964-caae-4594-b2a0-7eff079abcb2] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 721.616946] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 721.617160] 
nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 721.617278] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 721.619692] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell1:INSERT=67,nova_cell1:SELECT=31,nova_cell1:SAVEPOINT=1,nova_cell1:RELEASE=1 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 721.622137] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:SELECT=66,nova_cell1:UPDATE=22,nova_cell1:SAVEPOINT=5,nova_cell1:RELEASE=5,nova_cell1:INSERT=2 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 721.634223] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 721.634426] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f 
tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 721.634593] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 722.257900] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-8b67c71b-7712-41a6-a357-e7f50e368a7c tempest-ServerActionsTestJSON-1242049850 tempest-ServerActionsTestJSON-1242049850-project-member] Took 0.16 seconds to select destinations for 1 instance(s). 
{{(pid=52332) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 722.271241] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-8b67c71b-7712-41a6-a357-e7f50e368a7c tempest-ServerActionsTestJSON-1242049850 tempest-ServerActionsTestJSON-1242049850-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 722.272154] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-8b67c71b-7712-41a6-a357-e7f50e368a7c tempest-ServerActionsTestJSON-1242049850 tempest-ServerActionsTestJSON-1242049850-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 722.272469] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-8b67c71b-7712-41a6-a357-e7f50e368a7c tempest-ServerActionsTestJSON-1242049850 tempest-ServerActionsTestJSON-1242049850-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 722.339106] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-8b67c71b-7712-41a6-a357-e7f50e368a7c tempest-ServerActionsTestJSON-1242049850 tempest-ServerActionsTestJSON-1242049850-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 722.339383] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None 
req-8b67c71b-7712-41a6-a357-e7f50e368a7c tempest-ServerActionsTestJSON-1242049850 tempest-ServerActionsTestJSON-1242049850-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 722.339489] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-8b67c71b-7712-41a6-a357-e7f50e368a7c tempest-ServerActionsTestJSON-1242049850 tempest-ServerActionsTestJSON-1242049850-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 722.339842] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-8b67c71b-7712-41a6-a357-e7f50e368a7c tempest-ServerActionsTestJSON-1242049850 tempest-ServerActionsTestJSON-1242049850-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 722.340026] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-8b67c71b-7712-41a6-a357-e7f50e368a7c tempest-ServerActionsTestJSON-1242049850 tempest-ServerActionsTestJSON-1242049850-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 722.340181] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-8b67c71b-7712-41a6-a357-e7f50e368a7c tempest-ServerActionsTestJSON-1242049850 tempest-ServerActionsTestJSON-1242049850-project-member] Lock 
"59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 722.358620] nova-conductor[52332]: DEBUG nova.quota [None req-8b67c71b-7712-41a6-a357-e7f50e368a7c tempest-ServerActionsTestJSON-1242049850 tempest-ServerActionsTestJSON-1242049850-project-member] Getting quotas for project 1964b18f63824ec79b06da01fba78cdd. Resources: {'instances', 'ram', 'cores'} {{(pid=52332) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 722.361121] nova-conductor[52332]: DEBUG nova.quota [None req-8b67c71b-7712-41a6-a357-e7f50e368a7c tempest-ServerActionsTestJSON-1242049850 tempest-ServerActionsTestJSON-1242049850-project-member] Getting quotas for user 24abae32983f44ca98266b910b1b1ea4 and project 1964b18f63824ec79b06da01fba78cdd. Resources: {'instances', 'ram', 'cores'} {{(pid=52332) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 722.376185] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-8b67c71b-7712-41a6-a357-e7f50e368a7c tempest-ServerActionsTestJSON-1242049850 tempest-ServerActionsTestJSON-1242049850-project-member] [instance: f3032222-3305-4983-b49c-1a44ddb4b1a5] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52332) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 722.376673] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-8b67c71b-7712-41a6-a357-e7f50e368a7c tempest-ServerActionsTestJSON-1242049850 tempest-ServerActionsTestJSON-1242049850-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 722.376876] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None 
req-8b67c71b-7712-41a6-a357-e7f50e368a7c tempest-ServerActionsTestJSON-1242049850 tempest-ServerActionsTestJSON-1242049850-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 722.377069] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-8b67c71b-7712-41a6-a357-e7f50e368a7c tempest-ServerActionsTestJSON-1242049850 tempest-ServerActionsTestJSON-1242049850-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 722.385554] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-8b67c71b-7712-41a6-a357-e7f50e368a7c tempest-ServerActionsTestJSON-1242049850 tempest-ServerActionsTestJSON-1242049850-project-member] [instance: f3032222-3305-4983-b49c-1a44ddb4b1a5] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 722.386308] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-8b67c71b-7712-41a6-a357-e7f50e368a7c tempest-ServerActionsTestJSON-1242049850 tempest-ServerActionsTestJSON-1242049850-project-member] Acquiring lock 
"59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 722.386510] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-8b67c71b-7712-41a6-a357-e7f50e368a7c tempest-ServerActionsTestJSON-1242049850 tempest-ServerActionsTestJSON-1242049850-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 722.386674] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-8b67c71b-7712-41a6-a357-e7f50e368a7c tempest-ServerActionsTestJSON-1242049850 tempest-ServerActionsTestJSON-1242049850-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 722.407891] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-8b67c71b-7712-41a6-a357-e7f50e368a7c tempest-ServerActionsTestJSON-1242049850 tempest-ServerActionsTestJSON-1242049850-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 722.408136] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-8b67c71b-7712-41a6-a357-e7f50e368a7c tempest-ServerActionsTestJSON-1242049850 tempest-ServerActionsTestJSON-1242049850-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 722.408306] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-8b67c71b-7712-41a6-a357-e7f50e368a7c tempest-ServerActionsTestJSON-1242049850 tempest-ServerActionsTestJSON-1242049850-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 722.529487] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell1:INSERT=4,nova_cell1:SELECT=71,nova_cell1:UPDATE=23,nova_cell1:SAVEPOINT=1,nova_cell1:RELEASE=1 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 723.886897] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-af57d63c-722e-414d-a507-646554f880fb tempest-VolumesAssistedSnapshotsTest-1288447876 tempest-VolumesAssistedSnapshotsTest-1288447876-project-member] Took 0.14 seconds to select destinations for 1 instance(s). 
{{(pid=52331) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 723.899446] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-af57d63c-722e-414d-a507-646554f880fb tempest-VolumesAssistedSnapshotsTest-1288447876 tempest-VolumesAssistedSnapshotsTest-1288447876-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 723.899809] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-af57d63c-722e-414d-a507-646554f880fb tempest-VolumesAssistedSnapshotsTest-1288447876 tempest-VolumesAssistedSnapshotsTest-1288447876-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 723.899876] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-af57d63c-722e-414d-a507-646554f880fb tempest-VolumesAssistedSnapshotsTest-1288447876 tempest-VolumesAssistedSnapshotsTest-1288447876-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 723.941635] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-af57d63c-722e-414d-a507-646554f880fb tempest-VolumesAssistedSnapshotsTest-1288447876 tempest-VolumesAssistedSnapshotsTest-1288447876-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 723.941635] nova-conductor[52331]: DEBUG 
oslo_concurrency.lockutils [None req-af57d63c-722e-414d-a507-646554f880fb tempest-VolumesAssistedSnapshotsTest-1288447876 tempest-VolumesAssistedSnapshotsTest-1288447876-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 723.941635] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-af57d63c-722e-414d-a507-646554f880fb tempest-VolumesAssistedSnapshotsTest-1288447876 tempest-VolumesAssistedSnapshotsTest-1288447876-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 723.941971] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-af57d63c-722e-414d-a507-646554f880fb tempest-VolumesAssistedSnapshotsTest-1288447876 tempest-VolumesAssistedSnapshotsTest-1288447876-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 723.942269] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-af57d63c-722e-414d-a507-646554f880fb tempest-VolumesAssistedSnapshotsTest-1288447876 tempest-VolumesAssistedSnapshotsTest-1288447876-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 723.942364] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-af57d63c-722e-414d-a507-646554f880fb tempest-VolumesAssistedSnapshotsTest-1288447876 
tempest-VolumesAssistedSnapshotsTest-1288447876-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 723.952161] nova-conductor[52331]: DEBUG nova.quota [None req-af57d63c-722e-414d-a507-646554f880fb tempest-VolumesAssistedSnapshotsTest-1288447876 tempest-VolumesAssistedSnapshotsTest-1288447876-project-member] Getting quotas for project 2ffd7a02cb8a46ec9339861021619401. Resources: {'instances', 'ram', 'cores'} {{(pid=52331) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 723.954594] nova-conductor[52331]: DEBUG nova.quota [None req-af57d63c-722e-414d-a507-646554f880fb tempest-VolumesAssistedSnapshotsTest-1288447876 tempest-VolumesAssistedSnapshotsTest-1288447876-project-member] Getting quotas for user 77c3a1cfbbb54bbbadc9220456641af1 and project 2ffd7a02cb8a46ec9339861021619401. Resources: {'instances', 'ram', 'cores'} {{(pid=52331) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 723.960301] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-af57d63c-722e-414d-a507-646554f880fb tempest-VolumesAssistedSnapshotsTest-1288447876 tempest-VolumesAssistedSnapshotsTest-1288447876-project-member] [instance: 3892ebb3-f231-452f-a0cc-c25b7201ef34] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52331) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 723.960808] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-af57d63c-722e-414d-a507-646554f880fb tempest-VolumesAssistedSnapshotsTest-1288447876 tempest-VolumesAssistedSnapshotsTest-1288447876-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 723.961006] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-af57d63c-722e-414d-a507-646554f880fb tempest-VolumesAssistedSnapshotsTest-1288447876 tempest-VolumesAssistedSnapshotsTest-1288447876-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 723.961170] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-af57d63c-722e-414d-a507-646554f880fb tempest-VolumesAssistedSnapshotsTest-1288447876 tempest-VolumesAssistedSnapshotsTest-1288447876-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 723.964527] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-af57d63c-722e-414d-a507-646554f880fb tempest-VolumesAssistedSnapshotsTest-1288447876 tempest-VolumesAssistedSnapshotsTest-1288447876-project-member] [instance: 3892ebb3-f231-452f-a0cc-c25b7201ef34] block_device_mapping [BlockDeviceMapping(attachment_id=<?>,boot_index=0,connection_info=None,created_at=<?>,delete_on_termination=True,deleted=<?>,deleted_at=<?>,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=<?>,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=<?>,instance_uuid=<?>,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=<?>,uuid=<?>,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 723.965184] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils 
[None req-af57d63c-722e-414d-a507-646554f880fb tempest-VolumesAssistedSnapshotsTest-1288447876 tempest-VolumesAssistedSnapshotsTest-1288447876-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 723.965373] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-af57d63c-722e-414d-a507-646554f880fb tempest-VolumesAssistedSnapshotsTest-1288447876 tempest-VolumesAssistedSnapshotsTest-1288447876-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 723.965533] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-af57d63c-722e-414d-a507-646554f880fb tempest-VolumesAssistedSnapshotsTest-1288447876 tempest-VolumesAssistedSnapshotsTest-1288447876-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 723.980322] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-af57d63c-722e-414d-a507-646554f880fb tempest-VolumesAssistedSnapshotsTest-1288447876 tempest-VolumesAssistedSnapshotsTest-1288447876-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 723.980550] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-af57d63c-722e-414d-a507-646554f880fb tempest-VolumesAssistedSnapshotsTest-1288447876 
tempest-VolumesAssistedSnapshotsTest-1288447876-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 723.980690] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-af57d63c-722e-414d-a507-646554f880fb tempest-VolumesAssistedSnapshotsTest-1288447876 tempest-VolumesAssistedSnapshotsTest-1288447876-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 724.337783] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:UPDATE=22,nova_cell1:INSERT=5,nova_cell1:SELECT=73 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 729.016465] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-053935a2-3662-4fb4-a85a-c3ce6389394b tempest-ServersNegativeTestJSON-401075618 tempest-ServersNegativeTestJSON-401075618-project-member] Took 0.14 seconds to select destinations for 1 instance(s). 
{{(pid=52332) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 729.029706] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-053935a2-3662-4fb4-a85a-c3ce6389394b tempest-ServersNegativeTestJSON-401075618 tempest-ServersNegativeTestJSON-401075618-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 729.029948] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-053935a2-3662-4fb4-a85a-c3ce6389394b tempest-ServersNegativeTestJSON-401075618 tempest-ServersNegativeTestJSON-401075618-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 729.030117] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-053935a2-3662-4fb4-a85a-c3ce6389394b tempest-ServersNegativeTestJSON-401075618 tempest-ServersNegativeTestJSON-401075618-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 729.077452] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-053935a2-3662-4fb4-a85a-c3ce6389394b tempest-ServersNegativeTestJSON-401075618 tempest-ServersNegativeTestJSON-401075618-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 729.077452] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None 
req-053935a2-3662-4fb4-a85a-c3ce6389394b tempest-ServersNegativeTestJSON-401075618 tempest-ServersNegativeTestJSON-401075618-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 729.077452] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-053935a2-3662-4fb4-a85a-c3ce6389394b tempest-ServersNegativeTestJSON-401075618 tempest-ServersNegativeTestJSON-401075618-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 729.077764] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-053935a2-3662-4fb4-a85a-c3ce6389394b tempest-ServersNegativeTestJSON-401075618 tempest-ServersNegativeTestJSON-401075618-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 729.077764] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-053935a2-3662-4fb4-a85a-c3ce6389394b tempest-ServersNegativeTestJSON-401075618 tempest-ServersNegativeTestJSON-401075618-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 729.077897] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-053935a2-3662-4fb4-a85a-c3ce6389394b tempest-ServersNegativeTestJSON-401075618 tempest-ServersNegativeTestJSON-401075618-project-member] Lock 
"59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 729.090599] nova-conductor[52332]: DEBUG nova.quota [None req-053935a2-3662-4fb4-a85a-c3ce6389394b tempest-ServersNegativeTestJSON-401075618 tempest-ServersNegativeTestJSON-401075618-project-member] Getting quotas for project 735229d6d52243f28e90cffa3e7e0512. Resources: {'instances', 'ram', 'cores'} {{(pid=52332) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 729.093256] nova-conductor[52332]: DEBUG nova.quota [None req-053935a2-3662-4fb4-a85a-c3ce6389394b tempest-ServersNegativeTestJSON-401075618 tempest-ServersNegativeTestJSON-401075618-project-member] Getting quotas for user b2525ee6d73f442eb4f12a8f49b2a0a1 and project 735229d6d52243f28e90cffa3e7e0512. Resources: {'instances', 'ram', 'cores'} {{(pid=52332) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 729.100230] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-053935a2-3662-4fb4-a85a-c3ce6389394b tempest-ServersNegativeTestJSON-401075618 tempest-ServersNegativeTestJSON-401075618-project-member] [instance: 99efdaed-e22a-40ce-bc57-c953bc563db6] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52332) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 729.100778] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-053935a2-3662-4fb4-a85a-c3ce6389394b tempest-ServersNegativeTestJSON-401075618 tempest-ServersNegativeTestJSON-401075618-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 729.101061] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None 
req-053935a2-3662-4fb4-a85a-c3ce6389394b tempest-ServersNegativeTestJSON-401075618 tempest-ServersNegativeTestJSON-401075618-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 729.101247] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-053935a2-3662-4fb4-a85a-c3ce6389394b tempest-ServersNegativeTestJSON-401075618 tempest-ServersNegativeTestJSON-401075618-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 729.105516] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-053935a2-3662-4fb4-a85a-c3ce6389394b tempest-ServersNegativeTestJSON-401075618 tempest-ServersNegativeTestJSON-401075618-project-member] [instance: 99efdaed-e22a-40ce-bc57-c953bc563db6] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 729.106363] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-053935a2-3662-4fb4-a85a-c3ce6389394b tempest-ServersNegativeTestJSON-401075618 tempest-ServersNegativeTestJSON-401075618-project-member] Acquiring lock 
"59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 729.106577] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-053935a2-3662-4fb4-a85a-c3ce6389394b tempest-ServersNegativeTestJSON-401075618 tempest-ServersNegativeTestJSON-401075618-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 729.106807] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-053935a2-3662-4fb4-a85a-c3ce6389394b tempest-ServersNegativeTestJSON-401075618 tempest-ServersNegativeTestJSON-401075618-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 729.121511] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-053935a2-3662-4fb4-a85a-c3ce6389394b tempest-ServersNegativeTestJSON-401075618 tempest-ServersNegativeTestJSON-401075618-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 729.121768] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-053935a2-3662-4fb4-a85a-c3ce6389394b tempest-ServersNegativeTestJSON-401075618 tempest-ServersNegativeTestJSON-401075618-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52332) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 729.121908] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-053935a2-3662-4fb4-a85a-c3ce6389394b tempest-ServersNegativeTestJSON-401075618 tempest-ServersNegativeTestJSON-401075618-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 730.891522] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-afd2d988-4ac3-452c-866e-81965d82d8fd tempest-ServerShowV254Test-1884030427 tempest-ServerShowV254Test-1884030427-project-member] Took 0.14 seconds to select destinations for 1 instance(s). {{(pid=52332) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 730.905415] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-afd2d988-4ac3-452c-866e-81965d82d8fd tempest-ServerShowV254Test-1884030427 tempest-ServerShowV254Test-1884030427-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 730.905645] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-afd2d988-4ac3-452c-866e-81965d82d8fd tempest-ServerShowV254Test-1884030427 tempest-ServerShowV254Test-1884030427-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 730.905806] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-afd2d988-4ac3-452c-866e-81965d82d8fd tempest-ServerShowV254Test-1884030427 tempest-ServerShowV254Test-1884030427-project-member] Lock 
"59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 730.934304] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-afd2d988-4ac3-452c-866e-81965d82d8fd tempest-ServerShowV254Test-1884030427 tempest-ServerShowV254Test-1884030427-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 730.934512] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-afd2d988-4ac3-452c-866e-81965d82d8fd tempest-ServerShowV254Test-1884030427 tempest-ServerShowV254Test-1884030427-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 730.934675] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-afd2d988-4ac3-452c-866e-81965d82d8fd tempest-ServerShowV254Test-1884030427 tempest-ServerShowV254Test-1884030427-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 730.935423] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-afd2d988-4ac3-452c-866e-81965d82d8fd tempest-ServerShowV254Test-1884030427 tempest-ServerShowV254Test-1884030427-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 730.935607] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-afd2d988-4ac3-452c-866e-81965d82d8fd tempest-ServerShowV254Test-1884030427 tempest-ServerShowV254Test-1884030427-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 730.935825] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-afd2d988-4ac3-452c-866e-81965d82d8fd tempest-ServerShowV254Test-1884030427 tempest-ServerShowV254Test-1884030427-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 730.945584] nova-conductor[52332]: DEBUG nova.quota [None req-afd2d988-4ac3-452c-866e-81965d82d8fd tempest-ServerShowV254Test-1884030427 tempest-ServerShowV254Test-1884030427-project-member] Getting quotas for project 7fe92e656d1d4a37927e32ea27ae6b2f. Resources: {'instances', 'ram', 'cores'} {{(pid=52332) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 730.947075] nova-conductor[52332]: DEBUG nova.quota [None req-afd2d988-4ac3-452c-866e-81965d82d8fd tempest-ServerShowV254Test-1884030427 tempest-ServerShowV254Test-1884030427-project-member] Getting quotas for user ed85fadfdf634362a8c85e9750017c50 and project 7fe92e656d1d4a37927e32ea27ae6b2f. 
Resources: {'instances', 'ram', 'cores'} {{(pid=52332) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 730.957418] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-afd2d988-4ac3-452c-866e-81965d82d8fd tempest-ServerShowV254Test-1884030427 tempest-ServerShowV254Test-1884030427-project-member] [instance: 8e524142-3b24-45ac-90b0-8f659bfb15b1] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52332) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 730.957930] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-afd2d988-4ac3-452c-866e-81965d82d8fd tempest-ServerShowV254Test-1884030427 tempest-ServerShowV254Test-1884030427-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 730.958554] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-afd2d988-4ac3-452c-866e-81965d82d8fd tempest-ServerShowV254Test-1884030427 tempest-ServerShowV254Test-1884030427-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 730.958760] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-afd2d988-4ac3-452c-866e-81965d82d8fd tempest-ServerShowV254Test-1884030427 tempest-ServerShowV254Test-1884030427-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 730.961732] nova-conductor[52332]: DEBUG nova.conductor.manager [None 
req-afd2d988-4ac3-452c-866e-81965d82d8fd tempest-ServerShowV254Test-1884030427 tempest-ServerShowV254Test-1884030427-project-member] [instance: 8e524142-3b24-45ac-90b0-8f659bfb15b1] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 730.962424] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-afd2d988-4ac3-452c-866e-81965d82d8fd tempest-ServerShowV254Test-1884030427 tempest-ServerShowV254Test-1884030427-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 730.962629] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-afd2d988-4ac3-452c-866e-81965d82d8fd tempest-ServerShowV254Test-1884030427 tempest-ServerShowV254Test-1884030427-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 730.962794] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-afd2d988-4ac3-452c-866e-81965d82d8fd tempest-ServerShowV254Test-1884030427 tempest-ServerShowV254Test-1884030427-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 730.977910] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-afd2d988-4ac3-452c-866e-81965d82d8fd tempest-ServerShowV254Test-1884030427 tempest-ServerShowV254Test-1884030427-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 730.978155] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-afd2d988-4ac3-452c-866e-81965d82d8fd tempest-ServerShowV254Test-1884030427 tempest-ServerShowV254Test-1884030427-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 730.978357] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-afd2d988-4ac3-452c-866e-81965d82d8fd tempest-ServerShowV254Test-1884030427 tempest-ServerShowV254Test-1884030427-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 731.227371] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell1:INSERT=5,nova_cell1:SELECT=73,nova_cell1:UPDATE=22 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 731.375324] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-785b41e6-d7fa-41b1-a746-2d9da0237d73 tempest-AttachVolumeShelveTestJSON-322943081 tempest-AttachVolumeShelveTestJSON-322943081-project-member] Took 0.18 seconds to 
select destinations for 1 instance(s). {{(pid=52332) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 731.417481] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-785b41e6-d7fa-41b1-a746-2d9da0237d73 tempest-AttachVolumeShelveTestJSON-322943081 tempest-AttachVolumeShelveTestJSON-322943081-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 731.417810] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-785b41e6-d7fa-41b1-a746-2d9da0237d73 tempest-AttachVolumeShelveTestJSON-322943081 tempest-AttachVolumeShelveTestJSON-322943081-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 731.418008] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-785b41e6-d7fa-41b1-a746-2d9da0237d73 tempest-AttachVolumeShelveTestJSON-322943081 tempest-AttachVolumeShelveTestJSON-322943081-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 731.441293] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:SELECT=30,nova_cell1:SAVEPOINT=4,nova_cell1:INSERT=62,nova_cell1:RELEASE=4 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 731.494243] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-785b41e6-d7fa-41b1-a746-2d9da0237d73 tempest-AttachVolumeShelveTestJSON-322943081 tempest-AttachVolumeShelveTestJSON-322943081-project-member] Acquiring lock 
"00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 731.494581] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-785b41e6-d7fa-41b1-a746-2d9da0237d73 tempest-AttachVolumeShelveTestJSON-322943081 tempest-AttachVolumeShelveTestJSON-322943081-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 731.494752] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-785b41e6-d7fa-41b1-a746-2d9da0237d73 tempest-AttachVolumeShelveTestJSON-322943081 tempest-AttachVolumeShelveTestJSON-322943081-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 731.495129] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-785b41e6-d7fa-41b1-a746-2d9da0237d73 tempest-AttachVolumeShelveTestJSON-322943081 tempest-AttachVolumeShelveTestJSON-322943081-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 731.495509] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-785b41e6-d7fa-41b1-a746-2d9da0237d73 tempest-AttachVolumeShelveTestJSON-322943081 tempest-AttachVolumeShelveTestJSON-322943081-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 731.495721] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-785b41e6-d7fa-41b1-a746-2d9da0237d73 tempest-AttachVolumeShelveTestJSON-322943081 tempest-AttachVolumeShelveTestJSON-322943081-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 731.505828] nova-conductor[52332]: DEBUG nova.quota [None req-785b41e6-d7fa-41b1-a746-2d9da0237d73 tempest-AttachVolumeShelveTestJSON-322943081 tempest-AttachVolumeShelveTestJSON-322943081-project-member] Getting quotas for project 83a34318a55245779f44dea3ae029b51. Resources: {'instances', 'ram', 'cores'} {{(pid=52332) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 731.508026] nova-conductor[52332]: DEBUG nova.quota [None req-785b41e6-d7fa-41b1-a746-2d9da0237d73 tempest-AttachVolumeShelveTestJSON-322943081 tempest-AttachVolumeShelveTestJSON-322943081-project-member] Getting quotas for user 377209702331418f8cac324187c2725d and project 83a34318a55245779f44dea3ae029b51. 
Resources: {'instances', 'ram', 'cores'} {{(pid=52332) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 731.514677] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-785b41e6-d7fa-41b1-a746-2d9da0237d73 tempest-AttachVolumeShelveTestJSON-322943081 tempest-AttachVolumeShelveTestJSON-322943081-project-member] [instance: d1029006-8185-40ed-a8c5-ab1244873e09] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52332) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 731.515091] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-785b41e6-d7fa-41b1-a746-2d9da0237d73 tempest-AttachVolumeShelveTestJSON-322943081 tempest-AttachVolumeShelveTestJSON-322943081-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 731.515297] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-785b41e6-d7fa-41b1-a746-2d9da0237d73 tempest-AttachVolumeShelveTestJSON-322943081 tempest-AttachVolumeShelveTestJSON-322943081-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 731.515463] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-785b41e6-d7fa-41b1-a746-2d9da0237d73 tempest-AttachVolumeShelveTestJSON-322943081 tempest-AttachVolumeShelveTestJSON-322943081-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 731.519075] nova-conductor[52332]: DEBUG 
nova.conductor.manager [None req-785b41e6-d7fa-41b1-a746-2d9da0237d73 tempest-AttachVolumeShelveTestJSON-322943081 tempest-AttachVolumeShelveTestJSON-322943081-project-member] [instance: d1029006-8185-40ed-a8c5-ab1244873e09] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 731.519747] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-785b41e6-d7fa-41b1-a746-2d9da0237d73 tempest-AttachVolumeShelveTestJSON-322943081 tempest-AttachVolumeShelveTestJSON-322943081-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 731.519938] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-785b41e6-d7fa-41b1-a746-2d9da0237d73 tempest-AttachVolumeShelveTestJSON-322943081 tempest-AttachVolumeShelveTestJSON-322943081-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 731.520116] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-785b41e6-d7fa-41b1-a746-2d9da0237d73 tempest-AttachVolumeShelveTestJSON-322943081 tempest-AttachVolumeShelveTestJSON-322943081-project-member] 
Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 731.535435] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-785b41e6-d7fa-41b1-a746-2d9da0237d73 tempest-AttachVolumeShelveTestJSON-322943081 tempest-AttachVolumeShelveTestJSON-322943081-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 731.535670] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-785b41e6-d7fa-41b1-a746-2d9da0237d73 tempest-AttachVolumeShelveTestJSON-322943081 tempest-AttachVolumeShelveTestJSON-322943081-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 731.535920] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-785b41e6-d7fa-41b1-a746-2d9da0237d73 tempest-AttachVolumeShelveTestJSON-322943081 tempest-AttachVolumeShelveTestJSON-322943081-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 731.560636] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:SELECT=77,nova_cell1:UPDATE=17,nova_cell1:SAVEPOINT=3,nova_cell1:RELEASE=3 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 732.952332] nova-conductor[52331]: ERROR nova.scheduler.utils [None req-8b67c71b-7712-41a6-a357-e7f50e368a7c 
tempest-ServerActionsTestJSON-1242049850 tempest-ServerActionsTestJSON-1242049850-project-member] [instance: f3032222-3305-4983-b49c-1a44ddb4b1a5] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 537, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File 
"/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port eae5e9e6-930f-4d1e-b18b-182fbd248080, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance f3032222-3305-4983-b49c-1a44ddb4b1a5 was re-scheduled: Binding failed for port eae5e9e6-930f-4d1e-b18b-182fbd248080, please check neutron logs for more information.\n'] [ 732.952946] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-8b67c71b-7712-41a6-a357-e7f50e368a7c 
tempest-ServerActionsTestJSON-1242049850 tempest-ServerActionsTestJSON-1242049850-project-member] Rescheduling: True {{(pid=52331) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 732.953176] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-8b67c71b-7712-41a6-a357-e7f50e368a7c tempest-ServerActionsTestJSON-1242049850 tempest-ServerActionsTestJSON-1242049850-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance f3032222-3305-4983-b49c-1a44ddb4b1a5.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance f3032222-3305-4983-b49c-1a44ddb4b1a5. [ 732.953504] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-8b67c71b-7712-41a6-a357-e7f50e368a7c tempest-ServerActionsTestJSON-1242049850 tempest-ServerActionsTestJSON-1242049850-project-member] [instance: f3032222-3305-4983-b49c-1a44ddb4b1a5] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance f3032222-3305-4983-b49c-1a44ddb4b1a5. 
[ 732.987250] nova-conductor[52331]: DEBUG nova.network.neutron [None req-8b67c71b-7712-41a6-a357-e7f50e368a7c tempest-ServerActionsTestJSON-1242049850 tempest-ServerActionsTestJSON-1242049850-project-member] [instance: f3032222-3305-4983-b49c-1a44ddb4b1a5] deallocate_for_instance() {{(pid=52331) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 733.003886] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_api:SELECT=73,nova_api:DELETE=6,nova_api:UPDATE=3 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 733.003886] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_api:SELECT=89,nova_api:UPDATE=5,nova_api:DELETE=2 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 733.101877] nova-conductor[52331]: DEBUG nova.network.neutron [None req-8b67c71b-7712-41a6-a357-e7f50e368a7c tempest-ServerActionsTestJSON-1242049850 tempest-ServerActionsTestJSON-1242049850-project-member] [instance: f3032222-3305-4983-b49c-1a44ddb4b1a5] Instance cache missing network info. {{(pid=52331) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 733.107282] nova-conductor[52331]: DEBUG nova.network.neutron [None req-8b67c71b-7712-41a6-a357-e7f50e368a7c tempest-ServerActionsTestJSON-1242049850 tempest-ServerActionsTestJSON-1242049850-project-member] [instance: f3032222-3305-4983-b49c-1a44ddb4b1a5] Updating instance_info_cache with network_info: [] {{(pid=52331) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 733.149676] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-9090aa8f-3a60-484c-b5c2-23fe7dfb7c41 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Took 0.14 seconds to select destinations for 1 instance(s). 
{{(pid=52331) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 733.161602] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-9090aa8f-3a60-484c-b5c2-23fe7dfb7c41 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 733.164672] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-9090aa8f-3a60-484c-b5c2-23fe7dfb7c41 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 733.164672] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-9090aa8f-3a60-484c-b5c2-23fe7dfb7c41 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.003s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 733.197849] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-9090aa8f-3a60-484c-b5c2-23fe7dfb7c41 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 733.197849] nova-conductor[52331]: DEBUG 
oslo_concurrency.lockutils [None req-9090aa8f-3a60-484c-b5c2-23fe7dfb7c41 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 733.197849] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-9090aa8f-3a60-484c-b5c2-23fe7dfb7c41 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 733.198234] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-9090aa8f-3a60-484c-b5c2-23fe7dfb7c41 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 733.198451] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-9090aa8f-3a60-484c-b5c2-23fe7dfb7c41 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 733.198645] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-9090aa8f-3a60-484c-b5c2-23fe7dfb7c41 tempest-ServerRescueNegativeTestJSON-1050757123 
tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 733.203111] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell0:SELECT=13 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 733.209425] nova-conductor[52331]: DEBUG nova.quota [None req-9090aa8f-3a60-484c-b5c2-23fe7dfb7c41 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Getting quotas for project f4de690c6a95491e92ded8527c776043. Resources: {'instances', 'ram', 'cores'} {{(pid=52331) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 733.212710] nova-conductor[52331]: DEBUG nova.quota [None req-9090aa8f-3a60-484c-b5c2-23fe7dfb7c41 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Getting quotas for user 8e6631e0f3c443029647e74b6ecb55d9 and project f4de690c6a95491e92ded8527c776043. 
Resources: {'instances', 'ram', 'cores'} {{(pid=52331) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 733.219897] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-9090aa8f-3a60-484c-b5c2-23fe7dfb7c41 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] [instance: 230594df-15e6-4750-a50b-37481f7a629d] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52331) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 733.220618] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-9090aa8f-3a60-484c-b5c2-23fe7dfb7c41 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 733.220878] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-9090aa8f-3a60-484c-b5c2-23fe7dfb7c41 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 733.221084] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-9090aa8f-3a60-484c-b5c2-23fe7dfb7c41 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 733.223804] 
nova-conductor[52331]: DEBUG nova.conductor.manager [None req-9090aa8f-3a60-484c-b5c2-23fe7dfb7c41 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] [instance: 230594df-15e6-4750-a50b-37481f7a629d] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 733.224587] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-9090aa8f-3a60-484c-b5c2-23fe7dfb7c41 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 733.224826] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-9090aa8f-3a60-484c-b5c2-23fe7dfb7c41 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 733.225050] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-9090aa8f-3a60-484c-b5c2-23fe7dfb7c41 tempest-ServerRescueNegativeTestJSON-1050757123 
tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 733.239586] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-9090aa8f-3a60-484c-b5c2-23fe7dfb7c41 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 733.239843] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-9090aa8f-3a60-484c-b5c2-23fe7dfb7c41 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 733.240064] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-9090aa8f-3a60-484c-b5c2-23fe7dfb7c41 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 733.642407] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell1:SELECT=74,nova_cell1:UPDATE=20,nova_cell1:SAVEPOINT=2,nova_cell1:RELEASE=2,nova_cell1:INSERT=2 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 734.675748] 
nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:INSERT=4,nova_cell1:SELECT=71,nova_cell1:UPDATE=19,nova_cell1:SAVEPOINT=3,nova_cell1:RELEASE=3 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 734.991571] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-ab3b90e5-da23-443f-83d5-c4b621bedf71 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Took 0.21 seconds to select destinations for 1 instance(s). {{(pid=52331) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 735.004459] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-ab3b90e5-da23-443f-83d5-c4b621bedf71 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 735.004683] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-ab3b90e5-da23-443f-83d5-c4b621bedf71 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 735.004847] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-ab3b90e5-da23-443f-83d5-c4b621bedf71 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 735.087102] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-ab3b90e5-da23-443f-83d5-c4b621bedf71 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 735.087326] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-ab3b90e5-da23-443f-83d5-c4b621bedf71 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 735.087497] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-ab3b90e5-da23-443f-83d5-c4b621bedf71 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 735.087839] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-ab3b90e5-da23-443f-83d5-c4b621bedf71 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 735.088054] nova-conductor[52331]: DEBUG 
oslo_concurrency.lockutils [None req-ab3b90e5-da23-443f-83d5-c4b621bedf71 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 735.088245] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-ab3b90e5-da23-443f-83d5-c4b621bedf71 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 735.099422] nova-conductor[52331]: DEBUG nova.quota [None req-ab3b90e5-da23-443f-83d5-c4b621bedf71 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Getting quotas for project f4de690c6a95491e92ded8527c776043. Resources: {'instances', 'ram', 'cores'} {{(pid=52331) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 735.101788] nova-conductor[52331]: DEBUG nova.quota [None req-ab3b90e5-da23-443f-83d5-c4b621bedf71 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Getting quotas for user 8e6631e0f3c443029647e74b6ecb55d9 and project f4de690c6a95491e92ded8527c776043. 
Resources: {'instances', 'ram', 'cores'} {{(pid=52331) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 735.108241] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-ab3b90e5-da23-443f-83d5-c4b621bedf71 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] [instance: c494aa18-faf9-4ede-a76f-c4104431a5e1] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52331) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 735.108241] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-ab3b90e5-da23-443f-83d5-c4b621bedf71 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 735.108359] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-ab3b90e5-da23-443f-83d5-c4b621bedf71 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 735.108521] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-ab3b90e5-da23-443f-83d5-c4b621bedf71 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 735.111668] 
nova-conductor[52331]: DEBUG nova.conductor.manager [None req-ab3b90e5-da23-443f-83d5-c4b621bedf71 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] [instance: c494aa18-faf9-4ede-a76f-c4104431a5e1] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 735.112375] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-ab3b90e5-da23-443f-83d5-c4b621bedf71 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 735.112561] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-ab3b90e5-da23-443f-83d5-c4b621bedf71 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 735.112719] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-ab3b90e5-da23-443f-83d5-c4b621bedf71 tempest-ServerRescueNegativeTestJSON-1050757123 
tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 735.114399] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-5dace844-0acc-4597-bb17-2d3a023d2f7f tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Took 0.14 seconds to select destinations for 1 instance(s). {{(pid=52332) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 735.134546] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-5dace844-0acc-4597-bb17-2d3a023d2f7f tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 735.134779] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-5dace844-0acc-4597-bb17-2d3a023d2f7f tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 735.134947] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-5dace844-0acc-4597-bb17-2d3a023d2f7f tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 735.135234] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-ab3b90e5-da23-443f-83d5-c4b621bedf71 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 735.135442] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-ab3b90e5-da23-443f-83d5-c4b621bedf71 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 735.135600] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-ab3b90e5-da23-443f-83d5-c4b621bedf71 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 735.168321] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-5dace844-0acc-4597-bb17-2d3a023d2f7f tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 735.168601] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils 
[None req-5dace844-0acc-4597-bb17-2d3a023d2f7f tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 735.168767] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-5dace844-0acc-4597-bb17-2d3a023d2f7f tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 735.169468] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-5dace844-0acc-4597-bb17-2d3a023d2f7f tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 735.169839] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-5dace844-0acc-4597-bb17-2d3a023d2f7f tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 735.169985] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-5dace844-0acc-4597-bb17-2d3a023d2f7f tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Lock 
"59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 735.196560] nova-conductor[52332]: DEBUG nova.quota [None req-5dace844-0acc-4597-bb17-2d3a023d2f7f tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Getting quotas for project 858f3218a3de48b89a6ec9f62c0ff7fb. Resources: {'instances', 'ram', 'cores'} {{(pid=52332) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 735.208749] nova-conductor[52332]: DEBUG nova.quota [None req-5dace844-0acc-4597-bb17-2d3a023d2f7f tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Getting quotas for user ddd2a269170b464cabb5168a2d6a2ebd and project 858f3218a3de48b89a6ec9f62c0ff7fb. Resources: {'instances', 'ram', 'cores'} {{(pid=52332) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 735.220103] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-5dace844-0acc-4597-bb17-2d3a023d2f7f tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] [instance: 62fc6b88-cb3b-4088-9c33-2e2c7aed45d4] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52332) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 735.220643] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-5dace844-0acc-4597-bb17-2d3a023d2f7f tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 735.220840] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None 
req-5dace844-0acc-4597-bb17-2d3a023d2f7f tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 735.221004] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-5dace844-0acc-4597-bb17-2d3a023d2f7f tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 735.224083] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-5dace844-0acc-4597-bb17-2d3a023d2f7f tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] [instance: 62fc6b88-cb3b-4088-9c33-2e2c7aed45d4] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 735.224606] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-5dace844-0acc-4597-bb17-2d3a023d2f7f tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Acquiring lock 
"59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 735.225554] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-5dace844-0acc-4597-bb17-2d3a023d2f7f tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 735.225903] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-5dace844-0acc-4597-bb17-2d3a023d2f7f tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 735.241790] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-5dace844-0acc-4597-bb17-2d3a023d2f7f tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 735.241790] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-5dace844-0acc-4597-bb17-2d3a023d2f7f tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 735.241790] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-5dace844-0acc-4597-bb17-2d3a023d2f7f tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 735.729707] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell1:SELECT=72,nova_cell1:UPDATE=19,nova_cell1:SAVEPOINT=3,nova_cell1:RELEASE=3,nova_cell1:INSERT=3 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 735.785254] nova-conductor[52332]: ERROR nova.scheduler.utils [None req-af57d63c-722e-414d-a507-646554f880fb tempest-VolumesAssistedSnapshotsTest-1288447876 tempest-VolumesAssistedSnapshotsTest-1288447876-project-member] [instance: 3892ebb3-f231-452f-a0cc-c25b7201ef34] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 537, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, 
**kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 
'nova.exception.PortBindingFailed: Binding failed for port b322171e-e78e-4d3a-8e25-d8a9c8b8d619, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 3892ebb3-f231-452f-a0cc-c25b7201ef34 was re-scheduled: Binding failed for port b322171e-e78e-4d3a-8e25-d8a9c8b8d619, please check neutron logs for more information.\n'] [ 735.787436] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-af57d63c-722e-414d-a507-646554f880fb tempest-VolumesAssistedSnapshotsTest-1288447876 tempest-VolumesAssistedSnapshotsTest-1288447876-project-member] Rescheduling: True {{(pid=52332) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 735.787550] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-af57d63c-722e-414d-a507-646554f880fb tempest-VolumesAssistedSnapshotsTest-1288447876 tempest-VolumesAssistedSnapshotsTest-1288447876-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 3892ebb3-f231-452f-a0cc-c25b7201ef34.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 3892ebb3-f231-452f-a0cc-c25b7201ef34. 
[ 735.787883] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-af57d63c-722e-414d-a507-646554f880fb tempest-VolumesAssistedSnapshotsTest-1288447876 tempest-VolumesAssistedSnapshotsTest-1288447876-project-member] [instance: 3892ebb3-f231-452f-a0cc-c25b7201ef34] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 3892ebb3-f231-452f-a0cc-c25b7201ef34. [ 735.821629] nova-conductor[52332]: DEBUG nova.network.neutron [None req-af57d63c-722e-414d-a507-646554f880fb tempest-VolumesAssistedSnapshotsTest-1288447876 tempest-VolumesAssistedSnapshotsTest-1288447876-project-member] [instance: 3892ebb3-f231-452f-a0cc-c25b7201ef34] deallocate_for_instance() {{(pid=52332) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 735.914922] nova-conductor[52332]: DEBUG nova.network.neutron [None req-af57d63c-722e-414d-a507-646554f880fb tempest-VolumesAssistedSnapshotsTest-1288447876 tempest-VolumesAssistedSnapshotsTest-1288447876-project-member] [instance: 3892ebb3-f231-452f-a0cc-c25b7201ef34] Instance cache missing network info. 
{{(pid=52332) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 735.919014] nova-conductor[52332]: DEBUG nova.network.neutron [None req-af57d63c-722e-414d-a507-646554f880fb tempest-VolumesAssistedSnapshotsTest-1288447876 tempest-VolumesAssistedSnapshotsTest-1288447876-project-member] [instance: 3892ebb3-f231-452f-a0cc-c25b7201ef34] Updating instance_info_cache with network_info: [] {{(pid=52332) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 736.088149] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:UPDATE=15,nova_cell1:SELECT=71,nova_cell1:SAVEPOINT=6,nova_cell1:RELEASE=6,nova_cell1:INSERT=2 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 736.733051] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-709204c5-6a4a-47c7-8b56-826efa6dfe6c tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Took 0.12 seconds to select destinations for 1 instance(s). 
{{(pid=52331) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 736.748528] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-709204c5-6a4a-47c7-8b56-826efa6dfe6c tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 736.748761] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-709204c5-6a4a-47c7-8b56-826efa6dfe6c tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 736.748985] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-709204c5-6a4a-47c7-8b56-826efa6dfe6c tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 736.766532] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell1:INSERT=66,nova_cell1:SELECT=30,nova_cell1:SAVEPOINT=2,nova_cell1:RELEASE=2 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 736.782934] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-709204c5-6a4a-47c7-8b56-826efa6dfe6c tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 736.783222] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-709204c5-6a4a-47c7-8b56-826efa6dfe6c tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 736.783452] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-709204c5-6a4a-47c7-8b56-826efa6dfe6c tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 736.783955] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-709204c5-6a4a-47c7-8b56-826efa6dfe6c tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 736.784218] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-709204c5-6a4a-47c7-8b56-826efa6dfe6c tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 736.784422] 
nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-709204c5-6a4a-47c7-8b56-826efa6dfe6c tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 736.792292] nova-conductor[52331]: DEBUG nova.quota [None req-709204c5-6a4a-47c7-8b56-826efa6dfe6c tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Getting quotas for project 522cbd160e594fe2b192ce3aeaa6c682. Resources: {'instances', 'ram', 'cores'} {{(pid=52331) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 736.794726] nova-conductor[52331]: DEBUG nova.quota [None req-709204c5-6a4a-47c7-8b56-826efa6dfe6c tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Getting quotas for user b93e03caf34f46a488174c088efa722f and project 522cbd160e594fe2b192ce3aeaa6c682. 
Resources: {'instances', 'ram', 'cores'} {{(pid=52331) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 736.801353] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-709204c5-6a4a-47c7-8b56-826efa6dfe6c tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] [instance: 5d180089-6d96-488f-b0e8-43151f3f6902] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52331) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 736.801891] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-709204c5-6a4a-47c7-8b56-826efa6dfe6c tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 736.802132] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-709204c5-6a4a-47c7-8b56-826efa6dfe6c tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 736.802337] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-709204c5-6a4a-47c7-8b56-826efa6dfe6c tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 736.805570] nova-conductor[52331]: DEBUG nova.conductor.manager [None 
req-709204c5-6a4a-47c7-8b56-826efa6dfe6c tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] [instance: 5d180089-6d96-488f-b0e8-43151f3f6902] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 736.806303] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-709204c5-6a4a-47c7-8b56-826efa6dfe6c tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 736.806549] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-709204c5-6a4a-47c7-8b56-826efa6dfe6c tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 736.806752] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-709204c5-6a4a-47c7-8b56-826efa6dfe6c tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 736.819799] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-709204c5-6a4a-47c7-8b56-826efa6dfe6c tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 736.822385] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-709204c5-6a4a-47c7-8b56-826efa6dfe6c tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 736.822385] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-709204c5-6a4a-47c7-8b56-826efa6dfe6c tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 737.125421] nova-conductor[52331]: ERROR nova.scheduler.utils [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] [instance: a5880165-fe4c-4453-b045-2e4968d40f55] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in 
_build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 537, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in 
_update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port e3d8e910-ef0b-4a54-9baf-3a07179c5270, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance a5880165-fe4c-4453-b045-2e4968d40f55 was re-scheduled: Binding failed for port e3d8e910-ef0b-4a54-9baf-3a07179c5270, please check neutron logs for more information.\n'] [ 737.126026] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Rescheduling: True {{(pid=52331) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 737.126262] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 
tempest-MultipleCreateTestJSON-371914168-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance a5880165-fe4c-4453-b045-2e4968d40f55.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance a5880165-fe4c-4453-b045-2e4968d40f55. [ 737.126590] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] [instance: a5880165-fe4c-4453-b045-2e4968d40f55] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance a5880165-fe4c-4453-b045-2e4968d40f55. [ 737.178974] nova-conductor[52331]: DEBUG nova.network.neutron [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] [instance: a5880165-fe4c-4453-b045-2e4968d40f55] deallocate_for_instance() {{(pid=52331) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 737.219969] nova-conductor[52331]: DEBUG nova.network.neutron [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] [instance: a5880165-fe4c-4453-b045-2e4968d40f55] Instance cache missing network info. 
{{(pid=52331) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 737.225004] nova-conductor[52331]: DEBUG nova.network.neutron [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] [instance: a5880165-fe4c-4453-b045-2e4968d40f55] Updating instance_info_cache with network_info: [] {{(pid=52331) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 737.231370] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell1:SELECT=70,nova_cell1:UPDATE=22,nova_cell1:INSERT=4,nova_cell1:SAVEPOINT=2,nova_cell1:RELEASE=2 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 737.243005] nova-conductor[52332]: ERROR nova.scheduler.utils [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] [instance: e3009964-caae-4594-b2a0-7eff079abcb2] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 537, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, 
in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 
19390b56-1ea4-4744-8a08-c961fef7bd1e, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance e3009964-caae-4594-b2a0-7eff079abcb2 was re-scheduled: Binding failed for port 19390b56-1ea4-4744-8a08-c961fef7bd1e, please check neutron logs for more information.\n'] [ 737.243617] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Rescheduling: True {{(pid=52332) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 737.243834] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance e3009964-caae-4594-b2a0-7eff079abcb2.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance e3009964-caae-4594-b2a0-7eff079abcb2. [ 737.244115] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] [instance: e3009964-caae-4594-b2a0-7eff079abcb2] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. 
Exhausted all hosts available for retrying build failures for instance e3009964-caae-4594-b2a0-7eff079abcb2. [ 737.291247] nova-conductor[52332]: DEBUG nova.network.neutron [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] [instance: e3009964-caae-4594-b2a0-7eff079abcb2] deallocate_for_instance() {{(pid=52332) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 737.375564] nova-conductor[52332]: DEBUG nova.network.neutron [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] [instance: e3009964-caae-4594-b2a0-7eff079abcb2] Instance cache missing network info. {{(pid=52332) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 737.379126] nova-conductor[52332]: DEBUG nova.network.neutron [None req-eb3825f8-c258-4d45-93eb-8ff47bd27b2f tempest-MultipleCreateTestJSON-371914168 tempest-MultipleCreateTestJSON-371914168-project-member] [instance: e3009964-caae-4594-b2a0-7eff079abcb2] Updating instance_info_cache with network_info: [] {{(pid=52332) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 737.648038] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-335ed046-cbca-48c4-9a81-d8f5fa8821d6 tempest-ServersNegativeTestMultiTenantJSON-1400594594 tempest-ServersNegativeTestMultiTenantJSON-1400594594-project-member] Took 0.15 seconds to select destinations for 1 instance(s). 
{{(pid=52332) _schedule_instances /opt/stack/nova/nova/conductor/manager.py:945}} [ 737.665521] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-335ed046-cbca-48c4-9a81-d8f5fa8821d6 tempest-ServersNegativeTestMultiTenantJSON-1400594594 tempest-ServersNegativeTestMultiTenantJSON-1400594594-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 737.665827] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-335ed046-cbca-48c4-9a81-d8f5fa8821d6 tempest-ServersNegativeTestMultiTenantJSON-1400594594 tempest-ServersNegativeTestMultiTenantJSON-1400594594-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 737.666064] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-335ed046-cbca-48c4-9a81-d8f5fa8821d6 tempest-ServersNegativeTestMultiTenantJSON-1400594594 tempest-ServersNegativeTestMultiTenantJSON-1400594594-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 737.704328] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-335ed046-cbca-48c4-9a81-d8f5fa8821d6 tempest-ServersNegativeTestMultiTenantJSON-1400594594 tempest-ServersNegativeTestMultiTenantJSON-1400594594-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 
737.704534] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-335ed046-cbca-48c4-9a81-d8f5fa8821d6 tempest-ServersNegativeTestMultiTenantJSON-1400594594 tempest-ServersNegativeTestMultiTenantJSON-1400594594-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 737.704766] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-335ed046-cbca-48c4-9a81-d8f5fa8821d6 tempest-ServersNegativeTestMultiTenantJSON-1400594594 tempest-ServersNegativeTestMultiTenantJSON-1400594594-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 737.705225] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-335ed046-cbca-48c4-9a81-d8f5fa8821d6 tempest-ServersNegativeTestMultiTenantJSON-1400594594 tempest-ServersNegativeTestMultiTenantJSON-1400594594-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 737.705485] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-335ed046-cbca-48c4-9a81-d8f5fa8821d6 tempest-ServersNegativeTestMultiTenantJSON-1400594594 tempest-ServersNegativeTestMultiTenantJSON-1400594594-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 737.705703] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None 
req-335ed046-cbca-48c4-9a81-d8f5fa8821d6 tempest-ServersNegativeTestMultiTenantJSON-1400594594 tempest-ServersNegativeTestMultiTenantJSON-1400594594-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 737.714550] nova-conductor[52332]: DEBUG nova.quota [None req-335ed046-cbca-48c4-9a81-d8f5fa8821d6 tempest-ServersNegativeTestMultiTenantJSON-1400594594 tempest-ServersNegativeTestMultiTenantJSON-1400594594-project-member] Getting quotas for project e28eb28844624ee3bc0066a93f5138cd. Resources: {'instances', 'ram', 'cores'} {{(pid=52332) _get_quotas /opt/stack/nova/nova/quota.py:393}} [ 737.717161] nova-conductor[52332]: DEBUG nova.quota [None req-335ed046-cbca-48c4-9a81-d8f5fa8821d6 tempest-ServersNegativeTestMultiTenantJSON-1400594594 tempest-ServersNegativeTestMultiTenantJSON-1400594594-project-member] Getting quotas for user 082d689634a440558b9677f960cb36c6 and project e28eb28844624ee3bc0066a93f5138cd. 
Resources: {'instances', 'ram', 'cores'} {{(pid=52332) _get_quotas /opt/stack/nova/nova/quota.py:383}} [ 737.722825] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-335ed046-cbca-48c4-9a81-d8f5fa8821d6 tempest-ServersNegativeTestMultiTenantJSON-1400594594 tempest-ServersNegativeTestMultiTenantJSON-1400594594-project-member] [instance: 09dcef1e-75f4-4bbc-825e-c7b9668985c1] Selected host: cpu-1; Selected node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28; Alternates: [] {{(pid=52332) schedule_and_build_instances /opt/stack/nova/nova/conductor/manager.py:1761}} [ 737.723349] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-335ed046-cbca-48c4-9a81-d8f5fa8821d6 tempest-ServersNegativeTestMultiTenantJSON-1400594594 tempest-ServersNegativeTestMultiTenantJSON-1400594594-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 737.723620] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-335ed046-cbca-48c4-9a81-d8f5fa8821d6 tempest-ServersNegativeTestMultiTenantJSON-1400594594 tempest-ServersNegativeTestMultiTenantJSON-1400594594-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 737.723864] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-335ed046-cbca-48c4-9a81-d8f5fa8821d6 tempest-ServersNegativeTestMultiTenantJSON-1400594594 tempest-ServersNegativeTestMultiTenantJSON-1400594594-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 737.727562] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-335ed046-cbca-48c4-9a81-d8f5fa8821d6 tempest-ServersNegativeTestMultiTenantJSON-1400594594 tempest-ServersNegativeTestMultiTenantJSON-1400594594-project-member] [instance: 09dcef1e-75f4-4bbc-825e-c7b9668985c1] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 737.728322] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-335ed046-cbca-48c4-9a81-d8f5fa8821d6 tempest-ServersNegativeTestMultiTenantJSON-1400594594 tempest-ServersNegativeTestMultiTenantJSON-1400594594-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 737.728668] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-335ed046-cbca-48c4-9a81-d8f5fa8821d6 tempest-ServersNegativeTestMultiTenantJSON-1400594594 tempest-ServersNegativeTestMultiTenantJSON-1400594594-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 737.728915] nova-conductor[52332]: DEBUG 
oslo_concurrency.lockutils [None req-335ed046-cbca-48c4-9a81-d8f5fa8821d6 tempest-ServersNegativeTestMultiTenantJSON-1400594594 tempest-ServersNegativeTestMultiTenantJSON-1400594594-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 737.745840] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-335ed046-cbca-48c4-9a81-d8f5fa8821d6 tempest-ServersNegativeTestMultiTenantJSON-1400594594 tempest-ServersNegativeTestMultiTenantJSON-1400594594-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 737.746172] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-335ed046-cbca-48c4-9a81-d8f5fa8821d6 tempest-ServersNegativeTestMultiTenantJSON-1400594594 tempest-ServersNegativeTestMultiTenantJSON-1400594594-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 737.746636] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-335ed046-cbca-48c4-9a81-d8f5fa8821d6 tempest-ServersNegativeTestMultiTenantJSON-1400594594 tempest-ServersNegativeTestMultiTenantJSON-1400594594-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 737.764620] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats 
nova_cell1:UPDATE=22,nova_cell1:SELECT=68,nova_cell1:SAVEPOINT=3,nova_cell1:RELEASE=3,nova_cell1:INSERT=4 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 741.217769] nova-conductor[52332]: ERROR nova.scheduler.utils [None req-053935a2-3662-4fb4-a85a-c3ce6389394b tempest-ServersNegativeTestJSON-401075618 tempest-ServersNegativeTestJSON-401075618-project-member] [instance: 99efdaed-e22a-40ce-bc57-c953bc563db6] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 537, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File 
"/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 09163c4d-9693-4150-a1c9-3fb0543860c2, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 
'nova.exception.RescheduledException: Build of instance 99efdaed-e22a-40ce-bc57-c953bc563db6 was re-scheduled: Binding failed for port 09163c4d-9693-4150-a1c9-3fb0543860c2, please check neutron logs for more information.\n'] [ 741.218459] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-053935a2-3662-4fb4-a85a-c3ce6389394b tempest-ServersNegativeTestJSON-401075618 tempest-ServersNegativeTestJSON-401075618-project-member] Rescheduling: True {{(pid=52332) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 741.218749] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-053935a2-3662-4fb4-a85a-c3ce6389394b tempest-ServersNegativeTestJSON-401075618 tempest-ServersNegativeTestJSON-401075618-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 99efdaed-e22a-40ce-bc57-c953bc563db6.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 99efdaed-e22a-40ce-bc57-c953bc563db6. [ 741.219070] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-053935a2-3662-4fb4-a85a-c3ce6389394b tempest-ServersNegativeTestJSON-401075618 tempest-ServersNegativeTestJSON-401075618-project-member] [instance: 99efdaed-e22a-40ce-bc57-c953bc563db6] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 99efdaed-e22a-40ce-bc57-c953bc563db6. 
[ 741.223550] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell1:SELECT=77,nova_cell1:UPDATE=21,nova_cell1:SAVEPOINT=1,nova_cell1:RELEASE=1 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 741.242118] nova-conductor[52332]: DEBUG nova.network.neutron [None req-053935a2-3662-4fb4-a85a-c3ce6389394b tempest-ServersNegativeTestJSON-401075618 tempest-ServersNegativeTestJSON-401075618-project-member] [instance: 99efdaed-e22a-40ce-bc57-c953bc563db6] deallocate_for_instance() {{(pid=52332) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 741.349611] nova-conductor[52332]: ERROR nova.conductor.manager [None req-6f6b8bd3-ff16-427d-9f50-a81fbc1a0f88 tempest-ServersTestFqdnHostnames-869995063 tempest-ServersTestFqdnHostnames-869995063-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 741.349611] nova-conductor[52332]: Traceback (most recent call last): [ 741.349611] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 741.349611] nova-conductor[52332]: return func(*args, **kwargs) [ 741.349611] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 741.349611] nova-conductor[52332]: selections = self._select_destinations( [ 741.349611] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 741.349611] nova-conductor[52332]: selections = self._schedule( [ 741.349611] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 741.349611] nova-conductor[52332]: self._ensure_sufficient_hosts( [ 741.349611] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 741.349611] nova-conductor[52332]: raise 
exception.NoValidHost(reason=reason) [ 741.349611] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 741.349611] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last): [ 741.349611] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 741.349611] nova-conductor[52332]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 741.349611] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 741.349611] nova-conductor[52332]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 741.349611] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 741.349611] nova-conductor[52332]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 741.349611] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 741.349611] nova-conductor[52332]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 741.349611] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 741.349611] nova-conductor[52332]: ERROR nova.conductor.manager result = self.transport._send( [ 741.349611] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 741.349611] nova-conductor[52332]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 741.349611] nova-conductor[52332]: ERROR nova.conductor.manager File 
"/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 741.349611] nova-conductor[52332]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 741.349611] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 741.349611] nova-conductor[52332]: ERROR nova.conductor.manager raise result [ 741.349611] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 741.349611] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last): [ 741.349611] nova-conductor[52332]: ERROR nova.conductor.manager [ 741.349611] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 741.349611] nova-conductor[52332]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 741.349611] nova-conductor[52332]: ERROR nova.conductor.manager [ 741.349611] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 741.349611] nova-conductor[52332]: ERROR nova.conductor.manager selections = self._select_destinations( [ 741.349611] nova-conductor[52332]: ERROR nova.conductor.manager [ 741.349611] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 741.349611] nova-conductor[52332]: ERROR nova.conductor.manager selections = self._schedule( [ 741.349611] nova-conductor[52332]: ERROR nova.conductor.manager [ 741.349611] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 741.349611] nova-conductor[52332]: ERROR nova.conductor.manager 
self._ensure_sufficient_hosts( [ 741.349611] nova-conductor[52332]: ERROR nova.conductor.manager [ 741.349611] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 741.349611] nova-conductor[52332]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 741.349611] nova-conductor[52332]: ERROR nova.conductor.manager [ 741.349611] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 741.349611] nova-conductor[52332]: ERROR nova.conductor.manager [ 741.349611] nova-conductor[52332]: ERROR nova.conductor.manager [ 741.357215] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-6f6b8bd3-ff16-427d-9f50-a81fbc1a0f88 tempest-ServersTestFqdnHostnames-869995063 tempest-ServersTestFqdnHostnames-869995063-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 741.357484] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-6f6b8bd3-ff16-427d-9f50-a81fbc1a0f88 tempest-ServersTestFqdnHostnames-869995063 tempest-ServersTestFqdnHostnames-869995063-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 741.357690] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-6f6b8bd3-ff16-427d-9f50-a81fbc1a0f88 tempest-ServersTestFqdnHostnames-869995063 tempest-ServersTestFqdnHostnames-869995063-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s 
{{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 741.360250] nova-conductor[52332]: DEBUG nova.network.neutron [None req-053935a2-3662-4fb4-a85a-c3ce6389394b tempest-ServersNegativeTestJSON-401075618 tempest-ServersNegativeTestJSON-401075618-project-member] [instance: 99efdaed-e22a-40ce-bc57-c953bc563db6] Instance cache missing network info. {{(pid=52332) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 741.366222] nova-conductor[52332]: DEBUG nova.network.neutron [None req-053935a2-3662-4fb4-a85a-c3ce6389394b tempest-ServersNegativeTestJSON-401075618 tempest-ServersNegativeTestJSON-401075618-project-member] [instance: 99efdaed-e22a-40ce-bc57-c953bc563db6] Updating instance_info_cache with network_info: [] {{(pid=52332) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 741.432354] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-6f6b8bd3-ff16-427d-9f50-a81fbc1a0f88 tempest-ServersTestFqdnHostnames-869995063 tempest-ServersTestFqdnHostnames-869995063-project-member] [instance: 08438817-4d9a-42ad-b9bb-0dc044de6c82] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 741.433201] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-6f6b8bd3-ff16-427d-9f50-a81fbc1a0f88 tempest-ServersTestFqdnHostnames-869995063 
tempest-ServersTestFqdnHostnames-869995063-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 741.433692] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-6f6b8bd3-ff16-427d-9f50-a81fbc1a0f88 tempest-ServersTestFqdnHostnames-869995063 tempest-ServersTestFqdnHostnames-869995063-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 741.433856] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-6f6b8bd3-ff16-427d-9f50-a81fbc1a0f88 tempest-ServersTestFqdnHostnames-869995063 tempest-ServersTestFqdnHostnames-869995063-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 741.440680] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-6f6b8bd3-ff16-427d-9f50-a81fbc1a0f88 tempest-ServersTestFqdnHostnames-869995063 tempest-ServersTestFqdnHostnames-869995063-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. 
[ 741.440680] nova-conductor[52332]: Traceback (most recent call last): [ 741.440680] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 741.440680] nova-conductor[52332]: return func(*args, **kwargs) [ 741.440680] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 741.440680] nova-conductor[52332]: selections = self._select_destinations( [ 741.440680] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 741.440680] nova-conductor[52332]: selections = self._schedule( [ 741.440680] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 741.440680] nova-conductor[52332]: self._ensure_sufficient_hosts( [ 741.440680] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 741.440680] nova-conductor[52332]: raise exception.NoValidHost(reason=reason) [ 741.440680] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 741.440680] nova-conductor[52332]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 741.441407] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-6f6b8bd3-ff16-427d-9f50-a81fbc1a0f88 tempest-ServersTestFqdnHostnames-869995063 tempest-ServersTestFqdnHostnames-869995063-project-member] [instance: 08438817-4d9a-42ad-b9bb-0dc044de6c82] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 741.572023] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:SELECT=62,nova_cell1:UPDATE=17,nova_cell1:INSERT=3,nova_cell1:SAVEPOINT=9,nova_cell1:RELEASE=9 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 741.671038] nova-conductor[52332]: ERROR nova.scheduler.utils [None req-9090aa8f-3a60-484c-b5c2-23fe7dfb7c41 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] [instance: 230594df-15e6-4750-a50b-37481f7a629d] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 537, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return 
self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 2d94551f-6ddc-4201-b494-a883935e1787, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 
'nova.exception.RescheduledException: Build of instance 230594df-15e6-4750-a50b-37481f7a629d was re-scheduled: Binding failed for port 2d94551f-6ddc-4201-b494-a883935e1787, please check neutron logs for more information.\n'] [ 741.672012] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-9090aa8f-3a60-484c-b5c2-23fe7dfb7c41 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Rescheduling: True {{(pid=52332) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 741.672292] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-9090aa8f-3a60-484c-b5c2-23fe7dfb7c41 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 230594df-15e6-4750-a50b-37481f7a629d.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 230594df-15e6-4750-a50b-37481f7a629d. [ 741.672768] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-9090aa8f-3a60-484c-b5c2-23fe7dfb7c41 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] [instance: 230594df-15e6-4750-a50b-37481f7a629d] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 230594df-15e6-4750-a50b-37481f7a629d. 
[ 741.692349] nova-conductor[52332]: DEBUG nova.network.neutron [None req-9090aa8f-3a60-484c-b5c2-23fe7dfb7c41 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] [instance: 230594df-15e6-4750-a50b-37481f7a629d] deallocate_for_instance() {{(pid=52332) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 741.840429] nova-conductor[52332]: DEBUG nova.network.neutron [None req-9090aa8f-3a60-484c-b5c2-23fe7dfb7c41 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] [instance: 230594df-15e6-4750-a50b-37481f7a629d] Instance cache missing network info. {{(pid=52332) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 741.850513] nova-conductor[52332]: DEBUG nova.network.neutron [None req-9090aa8f-3a60-484c-b5c2-23fe7dfb7c41 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] [instance: 230594df-15e6-4750-a50b-37481f7a629d] Updating instance_info_cache with network_info: [] {{(pid=52332) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 743.218434] nova-conductor[52332]: ERROR nova.scheduler.utils [None req-785b41e6-d7fa-41b1-a746-2d9da0237d73 tempest-AttachVolumeShelveTestJSON-322943081 tempest-AttachVolumeShelveTestJSON-322943081-project-member] [instance: d1029006-8185-40ed-a8c5-ab1244873e09] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 537, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = 
self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File 
"/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port fe9d1342-dcf7-4d14-881e-b3b8d2c44269, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance d1029006-8185-40ed-a8c5-ab1244873e09 was re-scheduled: Binding failed for port fe9d1342-dcf7-4d14-881e-b3b8d2c44269, please check neutron logs for more information.\n'] [ 743.219007] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-785b41e6-d7fa-41b1-a746-2d9da0237d73 tempest-AttachVolumeShelveTestJSON-322943081 tempest-AttachVolumeShelveTestJSON-322943081-project-member] Rescheduling: True {{(pid=52332) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 743.219233] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-785b41e6-d7fa-41b1-a746-2d9da0237d73 tempest-AttachVolumeShelveTestJSON-322943081 tempest-AttachVolumeShelveTestJSON-322943081-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance d1029006-8185-40ed-a8c5-ab1244873e09.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. 
Exhausted all hosts available for retrying build failures for instance d1029006-8185-40ed-a8c5-ab1244873e09. [ 743.219441] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-785b41e6-d7fa-41b1-a746-2d9da0237d73 tempest-AttachVolumeShelveTestJSON-322943081 tempest-AttachVolumeShelveTestJSON-322943081-project-member] [instance: d1029006-8185-40ed-a8c5-ab1244873e09] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance d1029006-8185-40ed-a8c5-ab1244873e09. [ 743.241134] nova-conductor[52332]: DEBUG nova.network.neutron [None req-785b41e6-d7fa-41b1-a746-2d9da0237d73 tempest-AttachVolumeShelveTestJSON-322943081 tempest-AttachVolumeShelveTestJSON-322943081-project-member] [instance: d1029006-8185-40ed-a8c5-ab1244873e09] deallocate_for_instance() {{(pid=52332) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 743.328586] nova-conductor[52332]: DEBUG nova.network.neutron [None req-785b41e6-d7fa-41b1-a746-2d9da0237d73 tempest-AttachVolumeShelveTestJSON-322943081 tempest-AttachVolumeShelveTestJSON-322943081-project-member] [instance: d1029006-8185-40ed-a8c5-ab1244873e09] Instance cache missing network info. 
{{(pid=52332) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 743.332551] nova-conductor[52332]: DEBUG nova.network.neutron [None req-785b41e6-d7fa-41b1-a746-2d9da0237d73 tempest-AttachVolumeShelveTestJSON-322943081 tempest-AttachVolumeShelveTestJSON-322943081-project-member] [instance: d1029006-8185-40ed-a8c5-ab1244873e09] Updating instance_info_cache with network_info: [] {{(pid=52332) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 745.074780] nova-conductor[52331]: ERROR nova.conductor.manager [None req-48cd947d-5cd4-4f51-9a12-c77769a4fc77 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 745.074780] nova-conductor[52331]: Traceback (most recent call last): [ 745.074780] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 745.074780] nova-conductor[52331]: return func(*args, **kwargs) [ 745.074780] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 745.074780] nova-conductor[52331]: selections = self._select_destinations( [ 745.074780] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 745.074780] nova-conductor[52331]: selections = self._schedule( [ 745.074780] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 745.074780] nova-conductor[52331]: self._ensure_sufficient_hosts( [ 745.074780] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 745.074780] nova-conductor[52331]: raise exception.NoValidHost(reason=reason) [ 745.074780] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. 
There are not enough hosts available. [ 745.074780] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last): [ 745.074780] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 745.074780] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 745.074780] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 745.074780] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 745.074780] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 745.074780] nova-conductor[52331]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 745.074780] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 745.074780] nova-conductor[52331]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 745.074780] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 745.074780] nova-conductor[52331]: ERROR nova.conductor.manager result = self.transport._send( [ 745.074780] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 745.074780] nova-conductor[52331]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 745.074780] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 745.074780] nova-conductor[52331]: 
ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 745.074780] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 745.074780] nova-conductor[52331]: ERROR nova.conductor.manager raise result [ 745.074780] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 745.074780] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last): [ 745.074780] nova-conductor[52331]: ERROR nova.conductor.manager [ 745.074780] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 745.074780] nova-conductor[52331]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 745.074780] nova-conductor[52331]: ERROR nova.conductor.manager [ 745.074780] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 745.074780] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._select_destinations( [ 745.074780] nova-conductor[52331]: ERROR nova.conductor.manager [ 745.074780] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 745.074780] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._schedule( [ 745.074780] nova-conductor[52331]: ERROR nova.conductor.manager [ 745.074780] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 745.074780] nova-conductor[52331]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 745.074780] nova-conductor[52331]: ERROR nova.conductor.manager [ 745.074780] nova-conductor[52331]: ERROR 
nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 745.074780] nova-conductor[52331]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 745.074780] nova-conductor[52331]: ERROR nova.conductor.manager [ 745.074780] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 745.074780] nova-conductor[52331]: ERROR nova.conductor.manager [ 745.074780] nova-conductor[52331]: ERROR nova.conductor.manager [ 745.082076] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-48cd947d-5cd4-4f51-9a12-c77769a4fc77 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 745.082649] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-48cd947d-5cd4-4f51-9a12-c77769a4fc77 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 745.082897] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-48cd947d-5cd4-4f51-9a12-c77769a4fc77 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 745.133552] nova-conductor[52331]: DEBUG 
nova.conductor.manager [None req-48cd947d-5cd4-4f51-9a12-c77769a4fc77 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] [instance: cbbc45bc-9de5-4e7c-bf85-c36b3950db24] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 745.134279] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-48cd947d-5cd4-4f51-9a12-c77769a4fc77 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 745.134897] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-48cd947d-5cd4-4f51-9a12-c77769a4fc77 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 745.135419] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-48cd947d-5cd4-4f51-9a12-c77769a4fc77 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 745.138353] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-48cd947d-5cd4-4f51-9a12-c77769a4fc77 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 745.138353] nova-conductor[52331]: Traceback (most recent call last): [ 745.138353] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 745.138353] nova-conductor[52331]: return func(*args, **kwargs) [ 745.138353] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 745.138353] nova-conductor[52331]: selections = self._select_destinations( [ 745.138353] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 745.138353] nova-conductor[52331]: selections = self._schedule( [ 745.138353] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 745.138353] nova-conductor[52331]: self._ensure_sufficient_hosts( [ 745.138353] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 745.138353] nova-conductor[52331]: raise exception.NoValidHost(reason=reason) [ 745.138353] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 745.138353] nova-conductor[52331]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 745.139007] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-48cd947d-5cd4-4f51-9a12-c77769a4fc77 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] [instance: cbbc45bc-9de5-4e7c-bf85-c36b3950db24] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 745.986388] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell1:SELECT=64,nova_cell1:UPDATE=19,nova_cell1:SAVEPOINT=9,nova_cell1:RELEASE=8 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 746.010165] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:SELECT=67,nova_cell1:UPDATE=15,nova_cell1:INSERT=4,nova_cell1:SAVEPOINT=7,nova_cell1:RELEASE=7 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 746.113624] nova-conductor[52332]: ERROR nova.scheduler.utils [None req-5dace844-0acc-4597-bb17-2d3a023d2f7f tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] [instance: 62fc6b88-cb3b-4088-9c33-2e2c7aed45d4] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 537, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File 
"/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, 
in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 1e5adb0a-56e3-4f46-be38-81346f158f94, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 62fc6b88-cb3b-4088-9c33-2e2c7aed45d4 was re-scheduled: Binding failed for port 1e5adb0a-56e3-4f46-be38-81346f158f94, please check neutron logs for more information.\n'] [ 746.114253] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-5dace844-0acc-4597-bb17-2d3a023d2f7f tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Rescheduling: True {{(pid=52332) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 746.114485] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-5dace844-0acc-4597-bb17-2d3a023d2f7f tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 62fc6b88-cb3b-4088-9c33-2e2c7aed45d4.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 62fc6b88-cb3b-4088-9c33-2e2c7aed45d4. 
[ 746.114930] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-5dace844-0acc-4597-bb17-2d3a023d2f7f tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] [instance: 62fc6b88-cb3b-4088-9c33-2e2c7aed45d4] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 62fc6b88-cb3b-4088-9c33-2e2c7aed45d4. [ 746.135665] nova-conductor[52332]: DEBUG nova.network.neutron [None req-5dace844-0acc-4597-bb17-2d3a023d2f7f tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] [instance: 62fc6b88-cb3b-4088-9c33-2e2c7aed45d4] deallocate_for_instance() {{(pid=52332) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 746.461704] nova-conductor[52332]: DEBUG nova.network.neutron [None req-5dace844-0acc-4597-bb17-2d3a023d2f7f tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] [instance: 62fc6b88-cb3b-4088-9c33-2e2c7aed45d4] Instance cache missing network info. 
{{(pid=52332) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 746.468083] nova-conductor[52332]: DEBUG nova.network.neutron [None req-5dace844-0acc-4597-bb17-2d3a023d2f7f tempest-AttachVolumeNegativeTest-558563133 tempest-AttachVolumeNegativeTest-558563133-project-member] [instance: 62fc6b88-cb3b-4088-9c33-2e2c7aed45d4] Updating instance_info_cache with network_info: [] {{(pid=52332) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 746.559814] nova-conductor[52332]: ERROR nova.scheduler.utils [None req-ab3b90e5-da23-443f-83d5-c4b621bedf71 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] [instance: c494aa18-faf9-4ede-a76f-c4104431a5e1] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 537, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return 
self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 3a43a346-974d-4095-a1eb-1e3c8f5a8689, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File 
"/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance c494aa18-faf9-4ede-a76f-c4104431a5e1 was re-scheduled: Binding failed for port 3a43a346-974d-4095-a1eb-1e3c8f5a8689, please check neutron logs for more information.\n'] [ 746.560396] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-ab3b90e5-da23-443f-83d5-c4b621bedf71 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Rescheduling: True {{(pid=52332) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 746.560626] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-ab3b90e5-da23-443f-83d5-c4b621bedf71 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance c494aa18-faf9-4ede-a76f-c4104431a5e1.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance c494aa18-faf9-4ede-a76f-c4104431a5e1. [ 746.560839] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-ab3b90e5-da23-443f-83d5-c4b621bedf71 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] [instance: c494aa18-faf9-4ede-a76f-c4104431a5e1] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance c494aa18-faf9-4ede-a76f-c4104431a5e1. 
[ 746.589508] nova-conductor[52332]: DEBUG nova.network.neutron [None req-ab3b90e5-da23-443f-83d5-c4b621bedf71 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] [instance: c494aa18-faf9-4ede-a76f-c4104431a5e1] deallocate_for_instance() {{(pid=52332) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 746.698443] nova-conductor[52332]: DEBUG nova.network.neutron [None req-ab3b90e5-da23-443f-83d5-c4b621bedf71 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] [instance: c494aa18-faf9-4ede-a76f-c4104431a5e1] Instance cache missing network info. {{(pid=52332) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 746.706556] nova-conductor[52332]: DEBUG nova.network.neutron [None req-ab3b90e5-da23-443f-83d5-c4b621bedf71 tempest-ServerRescueNegativeTestJSON-1050757123 tempest-ServerRescueNegativeTestJSON-1050757123-project-member] [instance: c494aa18-faf9-4ede-a76f-c4104431a5e1] Updating instance_info_cache with network_info: [] {{(pid=52332) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 746.809889] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell1:INSERT=13,nova_cell1:SELECT=6 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 747.732203] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:INSERT=57,nova_cell1:SELECT=24,nova_cell1:SAVEPOINT=1,nova_cell1:RELEASE=1 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 748.068204] nova-conductor[52331]: ERROR nova.scheduler.utils [None req-335ed046-cbca-48c4-9a81-d8f5fa8821d6 tempest-ServersNegativeTestMultiTenantJSON-1400594594 tempest-ServersNegativeTestMultiTenantJSON-1400594594-project-member] [instance: 09dcef1e-75f4-4bbc-825e-c7b9668985c1] Error from last host: cpu-1 (node 
domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 537, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in 
allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port 74502e60-b5da-4425-a64d-b582b6e03f90, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 09dcef1e-75f4-4bbc-825e-c7b9668985c1 was re-scheduled: Binding failed for port 74502e60-b5da-4425-a64d-b582b6e03f90, please check neutron logs for more information.\n'] [ 748.068474] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:UPDATE=17,nova_cell1:SELECT=63,nova_cell1:INSERT=4,nova_cell1:SAVEPOINT=8,nova_cell1:RELEASE=8 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 748.068804] nova-conductor[52331]: DEBUG nova.conductor.manager [None 
req-335ed046-cbca-48c4-9a81-d8f5fa8821d6 tempest-ServersNegativeTestMultiTenantJSON-1400594594 tempest-ServersNegativeTestMultiTenantJSON-1400594594-project-member] Rescheduling: True {{(pid=52331) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 748.069142] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-335ed046-cbca-48c4-9a81-d8f5fa8821d6 tempest-ServersNegativeTestMultiTenantJSON-1400594594 tempest-ServersNegativeTestMultiTenantJSON-1400594594-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 09dcef1e-75f4-4bbc-825e-c7b9668985c1.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 09dcef1e-75f4-4bbc-825e-c7b9668985c1. [ 748.069488] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-335ed046-cbca-48c4-9a81-d8f5fa8821d6 tempest-ServersNegativeTestMultiTenantJSON-1400594594 tempest-ServersNegativeTestMultiTenantJSON-1400594594-project-member] [instance: 09dcef1e-75f4-4bbc-825e-c7b9668985c1] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 09dcef1e-75f4-4bbc-825e-c7b9668985c1. 
[ 748.096040] nova-conductor[52331]: DEBUG nova.network.neutron [None req-335ed046-cbca-48c4-9a81-d8f5fa8821d6 tempest-ServersNegativeTestMultiTenantJSON-1400594594 tempest-ServersNegativeTestMultiTenantJSON-1400594594-project-member] [instance: 09dcef1e-75f4-4bbc-825e-c7b9668985c1] deallocate_for_instance() {{(pid=52331) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 748.153710] nova-conductor[52331]: DEBUG nova.network.neutron [None req-335ed046-cbca-48c4-9a81-d8f5fa8821d6 tempest-ServersNegativeTestMultiTenantJSON-1400594594 tempest-ServersNegativeTestMultiTenantJSON-1400594594-project-member] [instance: 09dcef1e-75f4-4bbc-825e-c7b9668985c1] Instance cache missing network info. {{(pid=52331) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 748.157565] nova-conductor[52331]: DEBUG nova.network.neutron [None req-335ed046-cbca-48c4-9a81-d8f5fa8821d6 tempest-ServersNegativeTestMultiTenantJSON-1400594594 tempest-ServersNegativeTestMultiTenantJSON-1400594594-project-member] [instance: 09dcef1e-75f4-4bbc-825e-c7b9668985c1] Updating instance_info_cache with network_info: [] {{(pid=52331) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 748.194787] nova-conductor[52331]: ERROR nova.conductor.manager [None req-07e4a339-a4fc-4a8d-ac26-32e1cfbf3eb9 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 748.194787] nova-conductor[52331]: Traceback (most recent call last): [ 748.194787] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 748.194787] nova-conductor[52331]: return func(*args, **kwargs) [ 748.194787] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 748.194787] nova-conductor[52331]: selections = self._select_destinations( [ 748.194787] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 748.194787] nova-conductor[52331]: selections = self._schedule( [ 748.194787] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 748.194787] nova-conductor[52331]: self._ensure_sufficient_hosts( [ 748.194787] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 748.194787] nova-conductor[52331]: raise exception.NoValidHost(reason=reason) [ 748.194787] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 748.194787] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last): [ 748.194787] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 748.194787] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 748.194787] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 748.194787] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 748.194787] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 748.194787] nova-conductor[52331]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 748.194787] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 748.194787] nova-conductor[52331]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 748.194787] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 748.194787] nova-conductor[52331]: ERROR nova.conductor.manager result = self.transport._send( [ 748.194787] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 748.194787] nova-conductor[52331]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 748.194787] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 748.194787] nova-conductor[52331]: ERROR nova.conductor.manager return 
self._send(target, ctxt, message, wait_for_reply, timeout, [ 748.194787] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 748.194787] nova-conductor[52331]: ERROR nova.conductor.manager raise result [ 748.194787] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 748.194787] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last): [ 748.194787] nova-conductor[52331]: ERROR nova.conductor.manager [ 748.194787] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 748.194787] nova-conductor[52331]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 748.194787] nova-conductor[52331]: ERROR nova.conductor.manager [ 748.194787] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 748.194787] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._select_destinations( [ 748.194787] nova-conductor[52331]: ERROR nova.conductor.manager [ 748.194787] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 748.194787] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._schedule( [ 748.194787] nova-conductor[52331]: ERROR nova.conductor.manager [ 748.194787] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 748.194787] nova-conductor[52331]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 748.194787] nova-conductor[52331]: ERROR nova.conductor.manager [ 748.194787] nova-conductor[52331]: ERROR nova.conductor.manager File 
"/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 748.194787] nova-conductor[52331]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 748.194787] nova-conductor[52331]: ERROR nova.conductor.manager [ 748.194787] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 748.194787] nova-conductor[52331]: ERROR nova.conductor.manager [ 748.194787] nova-conductor[52331]: ERROR nova.conductor.manager [ 748.201127] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-07e4a339-a4fc-4a8d-ac26-32e1cfbf3eb9 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 748.201367] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-07e4a339-a4fc-4a8d-ac26-32e1cfbf3eb9 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 748.201636] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-07e4a339-a4fc-4a8d-ac26-32e1cfbf3eb9 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 748.249934] nova-conductor[52331]: DEBUG nova.conductor.manager [None 
req-07e4a339-a4fc-4a8d-ac26-32e1cfbf3eb9 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] [instance: eb2d0e5e-7b32-44d5-907c-cb12fb9965d6] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 748.250703] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-07e4a339-a4fc-4a8d-ac26-32e1cfbf3eb9 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 748.250890] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-07e4a339-a4fc-4a8d-ac26-32e1cfbf3eb9 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 748.251085] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-07e4a339-a4fc-4a8d-ac26-32e1cfbf3eb9 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 748.254340] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-07e4a339-a4fc-4a8d-ac26-32e1cfbf3eb9 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 748.254340] nova-conductor[52331]: Traceback (most recent call last): [ 748.254340] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 748.254340] nova-conductor[52331]: return func(*args, **kwargs) [ 748.254340] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 748.254340] nova-conductor[52331]: selections = self._select_destinations( [ 748.254340] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 748.254340] nova-conductor[52331]: selections = self._schedule( [ 748.254340] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 748.254340] nova-conductor[52331]: self._ensure_sufficient_hosts( [ 748.254340] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 748.254340] nova-conductor[52331]: raise exception.NoValidHost(reason=reason) [ 748.254340] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 748.254340] nova-conductor[52331]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 748.254862] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-07e4a339-a4fc-4a8d-ac26-32e1cfbf3eb9 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] [instance: eb2d0e5e-7b32-44d5-907c-cb12fb9965d6] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 750.123253] nova-conductor[52332]: ERROR nova.scheduler.utils [None req-709204c5-6a4a-47c7-8b56-826efa6dfe6c tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] [instance: 5d180089-6d96-488f-b0e8-43151f3f6902] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 537, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn\n vm_ref = self.build_virtual_machine(instance,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine\n vif_infos = vmwarevif.get_vif_info(self._session,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info\n for vif in network_info:\n', ' File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__\n return self._sync_wrapper(fn, *args, **kwargs)\n', ' File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper\n self.wait()\n', ' File "/opt/stack/nova/nova/network/model.py", line 635, in wait\n self[:] = self._gt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait\n return self._exit_event.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = 
hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main\n result = function(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper\n return func(*args, **kwargs)\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async\n raise e\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async\n nwinfo = self.network_api.allocate_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance\n created_port_ids = self._update_ports_for_instance(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance\n with excutils.save_and_reraise_exception():\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__\n self.force_reraise()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise\n raise self.value\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance\n updated_port = self._update_port(\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port\n _ensure_no_port_binding_failure(port)\n', ' File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure\n raise exception.PortBindingFailed(port_id=port[\'id\'])\n', 'nova.exception.PortBindingFailed: Binding failed for port c9cbdad2-db6f-4ef9-973a-f541cedd64a1, please check neutron logs for more information.\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File 
"/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', 'nova.exception.RescheduledException: Build of instance 5d180089-6d96-488f-b0e8-43151f3f6902 was re-scheduled: Binding failed for port c9cbdad2-db6f-4ef9-973a-f541cedd64a1, please check neutron logs for more information.\n'] [ 750.123253] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-709204c5-6a4a-47c7-8b56-826efa6dfe6c tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Rescheduling: True {{(pid=52332) build_instances /opt/stack/nova/nova/conductor/manager.py:695}} [ 750.123253] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-709204c5-6a4a-47c7-8b56-826efa6dfe6c tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 5d180089-6d96-488f-b0e8-43151f3f6902.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 5d180089-6d96-488f-b0e8-43151f3f6902. [ 750.123753] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-709204c5-6a4a-47c7-8b56-826efa6dfe6c tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] [instance: 5d180089-6d96-488f-b0e8-43151f3f6902] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 5d180089-6d96-488f-b0e8-43151f3f6902. 
[ 750.125303] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell1:RELEASE=6,nova_cell1:SELECT=67,nova_cell1:UPDATE=20,nova_cell1:INSERT=2,nova_cell1:SAVEPOINT=5 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 750.147358] nova-conductor[52332]: DEBUG nova.network.neutron [None req-709204c5-6a4a-47c7-8b56-826efa6dfe6c tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] [instance: 5d180089-6d96-488f-b0e8-43151f3f6902] deallocate_for_instance() {{(pid=52332) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} [ 750.217837] nova-conductor[52332]: DEBUG nova.network.neutron [None req-709204c5-6a4a-47c7-8b56-826efa6dfe6c tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] [instance: 5d180089-6d96-488f-b0e8-43151f3f6902] Instance cache missing network info. {{(pid=52332) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} [ 750.221333] nova-conductor[52332]: DEBUG nova.network.neutron [None req-709204c5-6a4a-47c7-8b56-826efa6dfe6c tempest-ServersTestMultiNic-1109782359 tempest-ServersTestMultiNic-1109782359-project-member] [instance: 5d180089-6d96-488f-b0e8-43151f3f6902] Updating instance_info_cache with network_info: [] {{(pid=52332) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 750.901666] nova-conductor[52332]: ERROR nova.conductor.manager [None req-6ac988c7-f7cf-41dc-9195-0998d1999892 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 750.901666] nova-conductor[52332]: Traceback (most recent call last): [ 750.901666] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 750.901666] nova-conductor[52332]: return func(*args, **kwargs) [ 750.901666] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 750.901666] nova-conductor[52332]: selections = self._select_destinations( [ 750.901666] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 750.901666] nova-conductor[52332]: selections = self._schedule( [ 750.901666] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 750.901666] nova-conductor[52332]: self._ensure_sufficient_hosts( [ 750.901666] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 750.901666] nova-conductor[52332]: raise exception.NoValidHost(reason=reason) [ 750.901666] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 750.901666] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last): [ 750.901666] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 750.901666] nova-conductor[52332]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 750.901666] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 750.901666] nova-conductor[52332]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 750.901666] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 750.901666] nova-conductor[52332]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 750.901666] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 750.901666] nova-conductor[52332]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 750.901666] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 750.901666] nova-conductor[52332]: ERROR nova.conductor.manager result = self.transport._send( [ 750.901666] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 750.901666] nova-conductor[52332]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 750.901666] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 750.901666] nova-conductor[52332]: ERROR nova.conductor.manager return 
self._send(target, ctxt, message, wait_for_reply, timeout, [ 750.901666] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 750.901666] nova-conductor[52332]: ERROR nova.conductor.manager raise result [ 750.901666] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 750.901666] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last): [ 750.901666] nova-conductor[52332]: ERROR nova.conductor.manager [ 750.901666] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 750.901666] nova-conductor[52332]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 750.901666] nova-conductor[52332]: ERROR nova.conductor.manager [ 750.901666] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 750.901666] nova-conductor[52332]: ERROR nova.conductor.manager selections = self._select_destinations( [ 750.901666] nova-conductor[52332]: ERROR nova.conductor.manager [ 750.901666] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 750.901666] nova-conductor[52332]: ERROR nova.conductor.manager selections = self._schedule( [ 750.901666] nova-conductor[52332]: ERROR nova.conductor.manager [ 750.901666] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 750.901666] nova-conductor[52332]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 750.901666] nova-conductor[52332]: ERROR nova.conductor.manager [ 750.901666] nova-conductor[52332]: ERROR nova.conductor.manager File 
"/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 750.901666] nova-conductor[52332]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 750.901666] nova-conductor[52332]: ERROR nova.conductor.manager [ 750.901666] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 750.901666] nova-conductor[52332]: ERROR nova.conductor.manager [ 750.901666] nova-conductor[52332]: ERROR nova.conductor.manager [ 750.908054] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-6ac988c7-f7cf-41dc-9195-0998d1999892 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 750.908275] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-6ac988c7-f7cf-41dc-9195-0998d1999892 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 750.908436] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-6ac988c7-f7cf-41dc-9195-0998d1999892 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 750.961007] nova-conductor[52332]: DEBUG nova.conductor.manager [None 
req-6ac988c7-f7cf-41dc-9195-0998d1999892 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] [instance: 60125e2d-bd25-4c7c-8505-43260063e45c] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 750.962489] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-6ac988c7-f7cf-41dc-9195-0998d1999892 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 750.962696] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-6ac988c7-f7cf-41dc-9195-0998d1999892 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 750.962951] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-6ac988c7-f7cf-41dc-9195-0998d1999892 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 750.971734] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-6ac988c7-f7cf-41dc-9195-0998d1999892 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 750.971734] nova-conductor[52332]: Traceback (most recent call last): [ 750.971734] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 750.971734] nova-conductor[52332]: return func(*args, **kwargs) [ 750.971734] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 750.971734] nova-conductor[52332]: selections = self._select_destinations( [ 750.971734] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 750.971734] nova-conductor[52332]: selections = self._schedule( [ 750.971734] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 750.971734] nova-conductor[52332]: self._ensure_sufficient_hosts( [ 750.971734] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 750.971734] nova-conductor[52332]: raise exception.NoValidHost(reason=reason) [ 750.971734] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 750.971734] nova-conductor[52332]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 750.971734] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-6ac988c7-f7cf-41dc-9195-0998d1999892 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] [instance: 60125e2d-bd25-4c7c-8505-43260063e45c] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 751.378734] nova-conductor[52331]: ERROR nova.conductor.manager [None req-2dd365c5-4876-4257-90c7-df8de2e6da06 tempest-AttachInterfacesTestJSON-1729323615 tempest-AttachInterfacesTestJSON-1729323615-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 751.378734] nova-conductor[52331]: Traceback (most recent call last): [ 751.378734] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 751.378734] nova-conductor[52331]: return func(*args, **kwargs) [ 751.378734] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 751.378734] nova-conductor[52331]: selections = self._select_destinations( [ 751.378734] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 751.378734] nova-conductor[52331]: selections = self._schedule( [ 751.378734] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 751.378734] nova-conductor[52331]: self._ensure_sufficient_hosts( [ 751.378734] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 751.378734] nova-conductor[52331]: raise exception.NoValidHost(reason=reason) [ 751.378734] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 751.378734] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last): [ 751.378734] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 751.378734] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 751.378734] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 751.378734] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 751.378734] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 751.378734] nova-conductor[52331]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 751.378734] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 751.378734] nova-conductor[52331]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 751.378734] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 751.378734] nova-conductor[52331]: ERROR nova.conductor.manager result = self.transport._send( [ 751.378734] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 751.378734] nova-conductor[52331]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 751.378734] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 751.378734] nova-conductor[52331]: ERROR nova.conductor.manager return 
self._send(target, ctxt, message, wait_for_reply, timeout, [ 751.378734] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 751.378734] nova-conductor[52331]: ERROR nova.conductor.manager raise result [ 751.378734] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 751.378734] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last): [ 751.378734] nova-conductor[52331]: ERROR nova.conductor.manager [ 751.378734] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 751.378734] nova-conductor[52331]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 751.378734] nova-conductor[52331]: ERROR nova.conductor.manager [ 751.378734] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 751.378734] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._select_destinations( [ 751.378734] nova-conductor[52331]: ERROR nova.conductor.manager [ 751.378734] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 751.378734] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._schedule( [ 751.378734] nova-conductor[52331]: ERROR nova.conductor.manager [ 751.378734] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 751.378734] nova-conductor[52331]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 751.378734] nova-conductor[52331]: ERROR nova.conductor.manager [ 751.378734] nova-conductor[52331]: ERROR nova.conductor.manager File 
"/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 751.378734] nova-conductor[52331]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 751.378734] nova-conductor[52331]: ERROR nova.conductor.manager [ 751.378734] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 751.378734] nova-conductor[52331]: ERROR nova.conductor.manager [ 751.378734] nova-conductor[52331]: ERROR nova.conductor.manager [ 751.385795] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-2dd365c5-4876-4257-90c7-df8de2e6da06 tempest-AttachInterfacesTestJSON-1729323615 tempest-AttachInterfacesTestJSON-1729323615-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 751.386090] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-2dd365c5-4876-4257-90c7-df8de2e6da06 tempest-AttachInterfacesTestJSON-1729323615 tempest-AttachInterfacesTestJSON-1729323615-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 751.386304] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-2dd365c5-4876-4257-90c7-df8de2e6da06 tempest-AttachInterfacesTestJSON-1729323615 tempest-AttachInterfacesTestJSON-1729323615-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 751.395908] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing 
DB stats nova_cell0:SELECT=42,nova_cell0:SAVEPOINT=2,nova_cell0:INSERT=46,nova_cell0:RELEASE=2,nova_cell0:UPDATE=8 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 751.431178] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-2dd365c5-4876-4257-90c7-df8de2e6da06 tempest-AttachInterfacesTestJSON-1729323615 tempest-AttachInterfacesTestJSON-1729323615-project-member] [instance: 6b8e7260-20fb-4be1-9917-2d16887e6057] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 751.431967] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-2dd365c5-4876-4257-90c7-df8de2e6da06 tempest-AttachInterfacesTestJSON-1729323615 tempest-AttachInterfacesTestJSON-1729323615-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 751.432284] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-2dd365c5-4876-4257-90c7-df8de2e6da06 tempest-AttachInterfacesTestJSON-1729323615 tempest-AttachInterfacesTestJSON-1729323615-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 751.432493] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-2dd365c5-4876-4257-90c7-df8de2e6da06 tempest-AttachInterfacesTestJSON-1729323615 tempest-AttachInterfacesTestJSON-1729323615-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 751.435431] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-2dd365c5-4876-4257-90c7-df8de2e6da06 tempest-AttachInterfacesTestJSON-1729323615 tempest-AttachInterfacesTestJSON-1729323615-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 751.435431] nova-conductor[52331]: Traceback (most recent call last): [ 751.435431] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 751.435431] nova-conductor[52331]: return func(*args, **kwargs) [ 751.435431] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 751.435431] nova-conductor[52331]: selections = self._select_destinations( [ 751.435431] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 751.435431] nova-conductor[52331]: selections = self._schedule( [ 751.435431] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 751.435431] nova-conductor[52331]: self._ensure_sufficient_hosts( [ 751.435431] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 751.435431] nova-conductor[52331]: raise exception.NoValidHost(reason=reason) [ 751.435431] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. 
There are not enough hosts available. [ 751.435431] nova-conductor[52331]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 751.436037] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-2dd365c5-4876-4257-90c7-df8de2e6da06 tempest-AttachInterfacesTestJSON-1729323615 tempest-AttachInterfacesTestJSON-1729323615-project-member] [instance: 6b8e7260-20fb-4be1-9917-2d16887e6057] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 752.935582] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_api:SELECT=86,nova_api:DELETE=5,nova_api:UPDATE=3 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 752.935966] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_api:SELECT=80,nova_api:UPDATE=5,nova_api:DELETE=3 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 753.034366] nova-conductor[52332]: ERROR nova.conductor.manager [None req-48dd16ce-4a1d-4a17-bb46-f4cb50788590 tempest-AttachVolumeShelveTestJSON-322943081 tempest-AttachVolumeShelveTestJSON-322943081-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 753.034366] nova-conductor[52332]: Traceback (most recent call last): [ 753.034366] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 753.034366] nova-conductor[52332]: return func(*args, **kwargs) [ 753.034366] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 753.034366] nova-conductor[52332]: selections = self._select_destinations( [ 753.034366] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 753.034366] nova-conductor[52332]: selections = self._schedule( [ 753.034366] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 753.034366] nova-conductor[52332]: self._ensure_sufficient_hosts( [ 753.034366] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 753.034366] nova-conductor[52332]: raise exception.NoValidHost(reason=reason) [ 753.034366] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
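Annotation: every traceback in this log bottoms out in the scheduler's host-sufficiency check — after filtering and weighing, if fewer hosts were selected than instances requested, scheduling aborts with `NoValidHost` and the conductor marks the instance ERROR. A simplified sketch of that check (the signature is reduced for illustration; the real method is `_ensure_sufficient_hosts` in `/opt/stack/nova/nova/scheduler/manager.py`, line 499 per the traceback):

```python
# Simplified sketch of the check behind the tracebacks in this log:
# if filtering left fewer candidate hosts than instances requested,
# scheduling aborts with NoValidHost. Signature reduced for
# illustration; the real method takes a context and host mapping.

class NoValidHost(Exception):
    """Stand-in for nova.exception.NoValidHost."""

    def __init__(self, reason):
        super().__init__("No valid host was found. " + reason)


def ensure_sufficient_hosts(hosts, num_instances):
    if len(hosts) < num_instances:
        reason = "There are not enough hosts available."
        raise NoValidHost(reason=reason)


ensure_sufficient_hosts(["cmp-1", "cmp-2"], 2)  # enough hosts: returns
try:
    ensure_sufficient_hosts([], 1)  # no candidates left after filtering
except NoValidHost as exc:
    print(exc)
```

In the runs above the candidate list was empty for every request (a single-node environment with its compute resources exhausted or its compute service down would produce exactly this pattern), so each build fails the same way.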
[ 753.034366] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last): [ 753.034366] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 753.034366] nova-conductor[52332]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 753.034366] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 753.034366] nova-conductor[52332]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 753.034366] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 753.034366] nova-conductor[52332]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 753.034366] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 753.034366] nova-conductor[52332]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 753.034366] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 753.034366] nova-conductor[52332]: ERROR nova.conductor.manager result = self.transport._send( [ 753.034366] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 753.034366] nova-conductor[52332]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 753.034366] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 753.034366] nova-conductor[52332]: ERROR nova.conductor.manager return 
self._send(target, ctxt, message, wait_for_reply, timeout, [ 753.034366] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 753.034366] nova-conductor[52332]: ERROR nova.conductor.manager raise result [ 753.034366] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 753.034366] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last): [ 753.034366] nova-conductor[52332]: ERROR nova.conductor.manager [ 753.034366] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 753.034366] nova-conductor[52332]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 753.034366] nova-conductor[52332]: ERROR nova.conductor.manager [ 753.034366] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 753.034366] nova-conductor[52332]: ERROR nova.conductor.manager selections = self._select_destinations( [ 753.034366] nova-conductor[52332]: ERROR nova.conductor.manager [ 753.034366] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 753.034366] nova-conductor[52332]: ERROR nova.conductor.manager selections = self._schedule( [ 753.034366] nova-conductor[52332]: ERROR nova.conductor.manager [ 753.034366] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 753.034366] nova-conductor[52332]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 753.034366] nova-conductor[52332]: ERROR nova.conductor.manager [ 753.034366] nova-conductor[52332]: ERROR nova.conductor.manager File 
"/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 753.034366] nova-conductor[52332]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 753.034366] nova-conductor[52332]: ERROR nova.conductor.manager [ 753.034366] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 753.034366] nova-conductor[52332]: ERROR nova.conductor.manager [ 753.034366] nova-conductor[52332]: ERROR nova.conductor.manager [ 753.042922] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-48dd16ce-4a1d-4a17-bb46-f4cb50788590 tempest-AttachVolumeShelveTestJSON-322943081 tempest-AttachVolumeShelveTestJSON-322943081-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 753.043189] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-48dd16ce-4a1d-4a17-bb46-f4cb50788590 tempest-AttachVolumeShelveTestJSON-322943081 tempest-AttachVolumeShelveTestJSON-322943081-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 753.043307] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-48dd16ce-4a1d-4a17-bb46-f4cb50788590 tempest-AttachVolumeShelveTestJSON-322943081 tempest-AttachVolumeShelveTestJSON-322943081-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 753.045645] nova-conductor[52332]: DEBUG dbcounter [-] [52332] 
Writing DB stats nova_cell0:SELECT=47,nova_cell0:SAVEPOINT=1,nova_cell0:INSERT=43,nova_cell0:RELEASE=1,nova_cell0:UPDATE=8 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 753.118888] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-48dd16ce-4a1d-4a17-bb46-f4cb50788590 tempest-AttachVolumeShelveTestJSON-322943081 tempest-AttachVolumeShelveTestJSON-322943081-project-member] [instance: 306b02b2-1006-41d4-94ae-88a1abf6faab] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 753.119980] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-48dd16ce-4a1d-4a17-bb46-f4cb50788590 tempest-AttachVolumeShelveTestJSON-322943081 tempest-AttachVolumeShelveTestJSON-322943081-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 753.120154] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-48dd16ce-4a1d-4a17-bb46-f4cb50788590 tempest-AttachVolumeShelveTestJSON-322943081 tempest-AttachVolumeShelveTestJSON-322943081-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52332) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 753.120327] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-48dd16ce-4a1d-4a17-bb46-f4cb50788590 tempest-AttachVolumeShelveTestJSON-322943081 tempest-AttachVolumeShelveTestJSON-322943081-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 753.124061] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-48dd16ce-4a1d-4a17-bb46-f4cb50788590 tempest-AttachVolumeShelveTestJSON-322943081 tempest-AttachVolumeShelveTestJSON-322943081-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 753.124061] nova-conductor[52332]: Traceback (most recent call last): [ 753.124061] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 753.124061] nova-conductor[52332]: return func(*args, **kwargs) [ 753.124061] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 753.124061] nova-conductor[52332]: selections = self._select_destinations( [ 753.124061] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 753.124061] nova-conductor[52332]: selections = self._schedule( [ 753.124061] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 753.124061] nova-conductor[52332]: self._ensure_sufficient_hosts( [ 753.124061] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 753.124061] nova-conductor[52332]: raise exception.NoValidHost(reason=reason) [ 753.124061] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. 
There are not enough hosts available. [ 753.124061] nova-conductor[52332]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 753.124611] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-48dd16ce-4a1d-4a17-bb46-f4cb50788590 tempest-AttachVolumeShelveTestJSON-322943081 tempest-AttachVolumeShelveTestJSON-322943081-project-member] [instance: 306b02b2-1006-41d4-94ae-88a1abf6faab] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 754.065571] nova-conductor[52331]: ERROR nova.conductor.manager [None req-efceb2c0-0912-4d30-b53a-d2da43041470 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 754.065571] nova-conductor[52331]: Traceback (most recent call last): [ 754.065571] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 754.065571] nova-conductor[52331]: return func(*args, **kwargs) [ 754.065571] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 754.065571] nova-conductor[52331]: selections = self._select_destinations( [ 754.065571] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 754.065571] nova-conductor[52331]: selections = self._schedule( [ 754.065571] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 754.065571] nova-conductor[52331]: self._ensure_sufficient_hosts( [ 754.065571] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 754.065571] nova-conductor[52331]: raise exception.NoValidHost(reason=reason) [ 754.065571] 
nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 754.065571] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last): [ 754.065571] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 754.065571] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 754.065571] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 754.065571] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 754.065571] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 754.065571] nova-conductor[52331]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 754.065571] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 754.065571] nova-conductor[52331]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 754.065571] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 754.065571] nova-conductor[52331]: ERROR nova.conductor.manager result = self.transport._send( [ 754.065571] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 754.065571] nova-conductor[52331]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 754.065571] nova-conductor[52331]: ERROR nova.conductor.manager File 
"/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 754.065571] nova-conductor[52331]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 754.065571] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 754.065571] nova-conductor[52331]: ERROR nova.conductor.manager raise result [ 754.065571] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 754.065571] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last): [ 754.065571] nova-conductor[52331]: ERROR nova.conductor.manager [ 754.065571] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 754.065571] nova-conductor[52331]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 754.065571] nova-conductor[52331]: ERROR nova.conductor.manager [ 754.065571] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 754.065571] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._select_destinations( [ 754.065571] nova-conductor[52331]: ERROR nova.conductor.manager [ 754.065571] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 754.065571] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._schedule( [ 754.065571] nova-conductor[52331]: ERROR nova.conductor.manager [ 754.065571] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 754.065571] nova-conductor[52331]: ERROR nova.conductor.manager 
self._ensure_sufficient_hosts( [ 754.065571] nova-conductor[52331]: ERROR nova.conductor.manager [ 754.065571] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 754.065571] nova-conductor[52331]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 754.065571] nova-conductor[52331]: ERROR nova.conductor.manager [ 754.065571] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 754.065571] nova-conductor[52331]: ERROR nova.conductor.manager [ 754.065571] nova-conductor[52331]: ERROR nova.conductor.manager [ 754.073279] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-efceb2c0-0912-4d30-b53a-d2da43041470 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 754.073513] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-efceb2c0-0912-4d30-b53a-d2da43041470 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 754.073813] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-efceb2c0-0912-4d30-b53a-d2da43041470 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s 
{{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 754.133721] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-efceb2c0-0912-4d30-b53a-d2da43041470 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] [instance: c658984d-b2eb-4ec4-ae2c-2c9fdced2370] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 754.134495] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-efceb2c0-0912-4d30-b53a-d2da43041470 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 754.134695] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-efceb2c0-0912-4d30-b53a-d2da43041470 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 754.134856] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None 
req-efceb2c0-0912-4d30-b53a-d2da43041470 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 754.142856] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-efceb2c0-0912-4d30-b53a-d2da43041470 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 754.142856] nova-conductor[52331]: Traceback (most recent call last): [ 754.142856] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 754.142856] nova-conductor[52331]: return func(*args, **kwargs) [ 754.142856] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 754.142856] nova-conductor[52331]: selections = self._select_destinations( [ 754.142856] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 754.142856] nova-conductor[52331]: selections = self._schedule( [ 754.142856] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 754.142856] nova-conductor[52331]: self._ensure_sufficient_hosts( [ 754.142856] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 754.142856] nova-conductor[52331]: raise exception.NoValidHost(reason=reason) [ 754.142856] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 754.142856] nova-conductor[52331]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. 
There are not enough hosts available. [ 754.143387] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-efceb2c0-0912-4d30-b53a-d2da43041470 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] [instance: c658984d-b2eb-4ec4-ae2c-2c9fdced2370] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 755.728312] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-9c45de38-deef-444c-a7c9-0593de23df4a tempest-ServerGroupTestJSON-1685244472 tempest-ServerGroupTestJSON-1685244472-project-member] Acquiring lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 755.728312] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-9c45de38-deef-444c-a7c9-0593de23df4a tempest-ServerGroupTestJSON-1685244472 tempest-ServerGroupTestJSON-1685244472-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 755.728312] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-9c45de38-deef-444c-a7c9-0593de23df4a tempest-ServerGroupTestJSON-1685244472 tempest-ServerGroupTestJSON-1685244472-project-member] Lock "59419fe5-2c5b-4004-8c8e-a51d243fef53" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 755.814879] nova-conductor[52332]: ERROR nova.conductor.manager [None req-9c45de38-deef-444c-a7c9-0593de23df4a tempest-ServerGroupTestJSON-1685244472 tempest-ServerGroupTestJSON-1685244472-project-member] 
Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 755.814879] nova-conductor[52332]: Traceback (most recent call last): [ 755.814879] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 755.814879] nova-conductor[52332]: return func(*args, **kwargs) [ 755.814879] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 755.814879] nova-conductor[52332]: selections = self._select_destinations( [ 755.814879] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 755.814879] nova-conductor[52332]: selections = self._schedule( [ 755.814879] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 755.814879] nova-conductor[52332]: self._ensure_sufficient_hosts( [ 755.814879] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 755.814879] nova-conductor[52332]: raise exception.NoValidHost(reason=reason) [ 755.814879] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 755.814879] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last): [ 755.814879] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 755.814879] nova-conductor[52332]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 755.814879] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 755.814879] nova-conductor[52332]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 755.814879] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 755.814879] nova-conductor[52332]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 755.814879] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 755.814879] nova-conductor[52332]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 755.814879] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 755.814879] nova-conductor[52332]: ERROR nova.conductor.manager result = self.transport._send( [ 755.814879] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 755.814879] nova-conductor[52332]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 755.814879] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 755.814879] nova-conductor[52332]: ERROR nova.conductor.manager return 
self._send(target, ctxt, message, wait_for_reply, timeout, [ 755.814879] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 755.814879] nova-conductor[52332]: ERROR nova.conductor.manager raise result [ 755.814879] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 755.814879] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last): [ 755.814879] nova-conductor[52332]: ERROR nova.conductor.manager [ 755.814879] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 755.814879] nova-conductor[52332]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 755.814879] nova-conductor[52332]: ERROR nova.conductor.manager [ 755.814879] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 755.814879] nova-conductor[52332]: ERROR nova.conductor.manager selections = self._select_destinations( [ 755.814879] nova-conductor[52332]: ERROR nova.conductor.manager [ 755.814879] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 755.814879] nova-conductor[52332]: ERROR nova.conductor.manager selections = self._schedule( [ 755.814879] nova-conductor[52332]: ERROR nova.conductor.manager [ 755.814879] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 755.814879] nova-conductor[52332]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 755.814879] nova-conductor[52332]: ERROR nova.conductor.manager [ 755.814879] nova-conductor[52332]: ERROR nova.conductor.manager File 
"/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 755.814879] nova-conductor[52332]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 755.814879] nova-conductor[52332]: ERROR nova.conductor.manager [ 755.814879] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 755.814879] nova-conductor[52332]: ERROR nova.conductor.manager [ 755.814879] nova-conductor[52332]: ERROR nova.conductor.manager [ 755.838405] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-9c45de38-deef-444c-a7c9-0593de23df4a tempest-ServerGroupTestJSON-1685244472 tempest-ServerGroupTestJSON-1685244472-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 755.838405] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-9c45de38-deef-444c-a7c9-0593de23df4a tempest-ServerGroupTestJSON-1685244472 tempest-ServerGroupTestJSON-1685244472-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 755.838557] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-9c45de38-deef-444c-a7c9-0593de23df4a tempest-ServerGroupTestJSON-1685244472 tempest-ServerGroupTestJSON-1685244472-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 755.878624] nova-conductor[52332]: DEBUG nova.conductor.manager [None 
req-9c45de38-deef-444c-a7c9-0593de23df4a tempest-ServerGroupTestJSON-1685244472 tempest-ServerGroupTestJSON-1685244472-project-member] [instance: 76733805-025e-4fb8-a2cd-9fddd75616d6] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 755.879471] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-9c45de38-deef-444c-a7c9-0593de23df4a tempest-ServerGroupTestJSON-1685244472 tempest-ServerGroupTestJSON-1685244472-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 755.879685] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-9c45de38-deef-444c-a7c9-0593de23df4a tempest-ServerGroupTestJSON-1685244472 tempest-ServerGroupTestJSON-1685244472-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 755.879852] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-9c45de38-deef-444c-a7c9-0593de23df4a tempest-ServerGroupTestJSON-1685244472 tempest-ServerGroupTestJSON-1685244472-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 755.882738] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-9c45de38-deef-444c-a7c9-0593de23df4a tempest-ServerGroupTestJSON-1685244472 tempest-ServerGroupTestJSON-1685244472-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 755.882738] nova-conductor[52332]: Traceback (most recent call last): [ 755.882738] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 755.882738] nova-conductor[52332]: return func(*args, **kwargs) [ 755.882738] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 755.882738] nova-conductor[52332]: selections = self._select_destinations( [ 755.882738] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 755.882738] nova-conductor[52332]: selections = self._schedule( [ 755.882738] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 755.882738] nova-conductor[52332]: self._ensure_sufficient_hosts( [ 755.882738] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 755.882738] nova-conductor[52332]: raise exception.NoValidHost(reason=reason) [ 755.882738] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 755.882738] nova-conductor[52332]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 755.883465] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-9c45de38-deef-444c-a7c9-0593de23df4a tempest-ServerGroupTestJSON-1685244472 tempest-ServerGroupTestJSON-1685244472-project-member] [instance: 76733805-025e-4fb8-a2cd-9fddd75616d6] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 756.882944] nova-conductor[52331]: ERROR nova.conductor.manager [None req-d4a60172-9c56-400a-94f9-63e1fd86d096 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 756.882944] nova-conductor[52331]: Traceback (most recent call last): [ 756.882944] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 756.882944] nova-conductor[52331]: return func(*args, **kwargs) [ 756.882944] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 756.882944] nova-conductor[52331]: selections = self._select_destinations( [ 756.882944] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 756.882944] nova-conductor[52331]: selections = self._schedule( [ 756.882944] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 756.882944] nova-conductor[52331]: self._ensure_sufficient_hosts( [ 756.882944] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 756.882944] nova-conductor[52331]: raise exception.NoValidHost(reason=reason) [ 756.882944] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 756.882944] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last): [ 756.882944] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 756.882944] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 756.882944] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 756.882944] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 756.882944] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 756.882944] nova-conductor[52331]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 756.882944] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 756.882944] nova-conductor[52331]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 756.882944] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 756.882944] nova-conductor[52331]: ERROR nova.conductor.manager result = self.transport._send( [ 756.882944] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 756.882944] nova-conductor[52331]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 756.882944] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 756.882944] nova-conductor[52331]: ERROR nova.conductor.manager return 
self._send(target, ctxt, message, wait_for_reply, timeout, [ 756.882944] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 756.882944] nova-conductor[52331]: ERROR nova.conductor.manager raise result [ 756.882944] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 756.882944] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last): [ 756.882944] nova-conductor[52331]: ERROR nova.conductor.manager [ 756.882944] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 756.882944] nova-conductor[52331]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 756.882944] nova-conductor[52331]: ERROR nova.conductor.manager [ 756.882944] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 756.882944] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._select_destinations( [ 756.882944] nova-conductor[52331]: ERROR nova.conductor.manager [ 756.882944] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 756.882944] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._schedule( [ 756.882944] nova-conductor[52331]: ERROR nova.conductor.manager [ 756.882944] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 756.882944] nova-conductor[52331]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 756.882944] nova-conductor[52331]: ERROR nova.conductor.manager [ 756.882944] nova-conductor[52331]: ERROR nova.conductor.manager File 
"/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 756.882944] nova-conductor[52331]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 756.882944] nova-conductor[52331]: ERROR nova.conductor.manager [ 756.882944] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 756.882944] nova-conductor[52331]: ERROR nova.conductor.manager [ 756.882944] nova-conductor[52331]: ERROR nova.conductor.manager [ 756.891206] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d4a60172-9c56-400a-94f9-63e1fd86d096 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 756.891551] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d4a60172-9c56-400a-94f9-63e1fd86d096 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 756.892913] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d4a60172-9c56-400a-94f9-63e1fd86d096 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 756.914918] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing 
DB stats nova_cell0:INSERT=57,nova_cell0:SELECT=35,nova_cell0:UPDATE=8 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 756.947282] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-d4a60172-9c56-400a-94f9-63e1fd86d096 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] [instance: c95a7eb6-c83c-469e-9c92-098a1cadc9cb] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 756.948140] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d4a60172-9c56-400a-94f9-63e1fd86d096 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 756.948350] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d4a60172-9c56-400a-94f9-63e1fd86d096 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 756.948511] 
nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-d4a60172-9c56-400a-94f9-63e1fd86d096 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 756.951630] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-d4a60172-9c56-400a-94f9-63e1fd86d096 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 756.951630] nova-conductor[52331]: Traceback (most recent call last):
[ 756.951630] nova-conductor[52331]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 756.951630] nova-conductor[52331]:     return func(*args, **kwargs)
[ 756.951630] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 756.951630] nova-conductor[52331]:     selections = self._select_destinations(
[ 756.951630] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 756.951630] nova-conductor[52331]:     selections = self._schedule(
[ 756.951630] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 756.951630] nova-conductor[52331]:     self._ensure_sufficient_hosts(
[ 756.951630] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 756.951630] nova-conductor[52331]:     raise exception.NoValidHost(reason=reason)
[ 756.951630] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 756.951630] nova-conductor[52331]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 756.952499] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-d4a60172-9c56-400a-94f9-63e1fd86d096 tempest-ServerDiskConfigTestJSON-1865767556 tempest-ServerDiskConfigTestJSON-1865767556-project-member] [instance: c95a7eb6-c83c-469e-9c92-098a1cadc9cb] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 757.328069] nova-conductor[52332]: ERROR nova.conductor.manager [None req-639bd5ab-f139-40ae-be0a-8872267faf92 tempest-AttachInterfacesTestJSON-1729323615 tempest-AttachInterfacesTestJSON-1729323615-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 757.328069] nova-conductor[52332]: Traceback (most recent call last):
[ 757.328069] nova-conductor[52332]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 757.328069] nova-conductor[52332]:     return func(*args, **kwargs)
[ 757.328069] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 757.328069] nova-conductor[52332]:     selections = self._select_destinations(
[ 757.328069] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 757.328069] nova-conductor[52332]:     selections = self._schedule(
[ 757.328069] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 757.328069] nova-conductor[52332]:     self._ensure_sufficient_hosts(
[ 757.328069] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 757.328069] nova-conductor[52332]:     raise exception.NoValidHost(reason=reason)
[ 757.328069] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 757.328069] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 757.328069] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 757.328069] nova-conductor[52332]: ERROR nova.conductor.manager     host_lists = self._schedule_instances(context, request_specs[0],
[ 757.328069] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 757.328069] nova-conductor[52332]: ERROR nova.conductor.manager     host_lists = self.query_client.select_destinations(
[ 757.328069] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 757.328069] nova-conductor[52332]: ERROR nova.conductor.manager     return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 757.328069] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 757.328069] nova-conductor[52332]: ERROR nova.conductor.manager     return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 757.328069] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 757.328069] nova-conductor[52332]: ERROR nova.conductor.manager     result = self.transport._send(
[ 757.328069] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 757.328069] nova-conductor[52332]: ERROR nova.conductor.manager     return self._driver.send(target, ctxt, message,
[ 757.328069] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 757.328069] nova-conductor[52332]: ERROR nova.conductor.manager     return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 757.328069] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 757.328069] nova-conductor[52332]: ERROR nova.conductor.manager     raise result
[ 757.328069] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 757.328069] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 757.328069] nova-conductor[52332]: ERROR nova.conductor.manager
[ 757.328069] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 757.328069] nova-conductor[52332]: ERROR nova.conductor.manager     return func(*args, **kwargs)
[ 757.328069] nova-conductor[52332]: ERROR nova.conductor.manager
[ 757.328069] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 757.328069] nova-conductor[52332]: ERROR nova.conductor.manager     selections = self._select_destinations(
[ 757.328069] nova-conductor[52332]: ERROR nova.conductor.manager
[ 757.328069] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 757.328069] nova-conductor[52332]: ERROR nova.conductor.manager     selections = self._schedule(
[ 757.328069] nova-conductor[52332]: ERROR nova.conductor.manager
[ 757.328069] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 757.328069] nova-conductor[52332]: ERROR nova.conductor.manager     self._ensure_sufficient_hosts(
[ 757.328069] nova-conductor[52332]: ERROR nova.conductor.manager
[ 757.328069] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 757.328069] nova-conductor[52332]: ERROR nova.conductor.manager     raise exception.NoValidHost(reason=reason)
[ 757.328069] nova-conductor[52332]: ERROR nova.conductor.manager
[ 757.328069] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 757.328069] nova-conductor[52332]: ERROR nova.conductor.manager
[ 757.328069] nova-conductor[52332]: ERROR nova.conductor.manager
[ 757.328069] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-639bd5ab-f139-40ae-be0a-8872267faf92 tempest-AttachInterfacesTestJSON-1729323615 tempest-AttachInterfacesTestJSON-1729323615-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 757.330691] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-639bd5ab-f139-40ae-be0a-8872267faf92 tempest-AttachInterfacesTestJSON-1729323615 tempest-AttachInterfacesTestJSON-1729323615-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.003s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 757.331061] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-639bd5ab-f139-40ae-be0a-8872267faf92 tempest-AttachInterfacesTestJSON-1729323615 tempest-AttachInterfacesTestJSON-1729323615-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 757.342767] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell0:SELECT=36,nova_cell0:SAVEPOINT=2,nova_cell0:INSERT=52,nova_cell0:RELEASE=2,nova_cell0:UPDATE=8 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 757.382629] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-639bd5ab-f139-40ae-be0a-8872267faf92 tempest-AttachInterfacesTestJSON-1729323615 tempest-AttachInterfacesTestJSON-1729323615-project-member] [instance: 47821261-67e9-46ac-a06c-48cede637d6f] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 757.382763] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-639bd5ab-f139-40ae-be0a-8872267faf92 tempest-AttachInterfacesTestJSON-1729323615 tempest-AttachInterfacesTestJSON-1729323615-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 757.382988] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-639bd5ab-f139-40ae-be0a-8872267faf92 tempest-AttachInterfacesTestJSON-1729323615 tempest-AttachInterfacesTestJSON-1729323615-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 757.383159] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-639bd5ab-f139-40ae-be0a-8872267faf92 tempest-AttachInterfacesTestJSON-1729323615 tempest-AttachInterfacesTestJSON-1729323615-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 757.390535] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-639bd5ab-f139-40ae-be0a-8872267faf92 tempest-AttachInterfacesTestJSON-1729323615 tempest-AttachInterfacesTestJSON-1729323615-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 757.390535] nova-conductor[52332]: Traceback (most recent call last):
[ 757.390535] nova-conductor[52332]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 757.390535] nova-conductor[52332]:     return func(*args, **kwargs)
[ 757.390535] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 757.390535] nova-conductor[52332]:     selections = self._select_destinations(
[ 757.390535] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 757.390535] nova-conductor[52332]:     selections = self._schedule(
[ 757.390535] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 757.390535] nova-conductor[52332]:     self._ensure_sufficient_hosts(
[ 757.390535] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 757.390535] nova-conductor[52332]:     raise exception.NoValidHost(reason=reason)
[ 757.390535] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 757.390535] nova-conductor[52332]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 757.391079] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-639bd5ab-f139-40ae-be0a-8872267faf92 tempest-AttachInterfacesTestJSON-1729323615 tempest-AttachInterfacesTestJSON-1729323615-project-member] [instance: 47821261-67e9-46ac-a06c-48cede637d6f] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 761.936677] nova-conductor[52331]: ERROR nova.conductor.manager [None req-49449907-6b61-4d0a-b7ca-e34a3db450b9 tempest-AttachVolumeTestJSON-1669873106 tempest-AttachVolumeTestJSON-1669873106-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 761.936677] nova-conductor[52331]: Traceback (most recent call last):
[ 761.936677] nova-conductor[52331]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 761.936677] nova-conductor[52331]:     return func(*args, **kwargs)
[ 761.936677] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 761.936677] nova-conductor[52331]:     selections = self._select_destinations(
[ 761.936677] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 761.936677] nova-conductor[52331]:     selections = self._schedule(
[ 761.936677] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 761.936677] nova-conductor[52331]:     self._ensure_sufficient_hosts(
[ 761.936677] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 761.936677] nova-conductor[52331]:     raise exception.NoValidHost(reason=reason)
[ 761.936677] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 761.936677] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 761.936677] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 761.936677] nova-conductor[52331]: ERROR nova.conductor.manager     host_lists = self._schedule_instances(context, request_specs[0],
[ 761.936677] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 761.936677] nova-conductor[52331]: ERROR nova.conductor.manager     host_lists = self.query_client.select_destinations(
[ 761.936677] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 761.936677] nova-conductor[52331]: ERROR nova.conductor.manager     return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 761.936677] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 761.936677] nova-conductor[52331]: ERROR nova.conductor.manager     return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 761.936677] nova-conductor[52331]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 761.936677] nova-conductor[52331]: ERROR nova.conductor.manager     result = self.transport._send(
[ 761.936677] nova-conductor[52331]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 761.936677] nova-conductor[52331]: ERROR nova.conductor.manager     return self._driver.send(target, ctxt, message,
[ 761.936677] nova-conductor[52331]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 761.936677] nova-conductor[52331]: ERROR nova.conductor.manager     return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 761.936677] nova-conductor[52331]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 761.936677] nova-conductor[52331]: ERROR nova.conductor.manager     raise result
[ 761.936677] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 761.936677] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 761.936677] nova-conductor[52331]: ERROR nova.conductor.manager
[ 761.936677] nova-conductor[52331]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 761.936677] nova-conductor[52331]: ERROR nova.conductor.manager     return func(*args, **kwargs)
[ 761.936677] nova-conductor[52331]: ERROR nova.conductor.manager
[ 761.936677] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 761.936677] nova-conductor[52331]: ERROR nova.conductor.manager     selections = self._select_destinations(
[ 761.936677] nova-conductor[52331]: ERROR nova.conductor.manager
[ 761.936677] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 761.936677] nova-conductor[52331]: ERROR nova.conductor.manager     selections = self._schedule(
[ 761.936677] nova-conductor[52331]: ERROR nova.conductor.manager
[ 761.936677] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 761.936677] nova-conductor[52331]: ERROR nova.conductor.manager     self._ensure_sufficient_hosts(
[ 761.936677] nova-conductor[52331]: ERROR nova.conductor.manager
[ 761.936677] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 761.936677] nova-conductor[52331]: ERROR nova.conductor.manager     raise exception.NoValidHost(reason=reason)
[ 761.936677] nova-conductor[52331]: ERROR nova.conductor.manager
[ 761.936677] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 761.936677] nova-conductor[52331]: ERROR nova.conductor.manager
[ 761.936677] nova-conductor[52331]: ERROR nova.conductor.manager
[ 761.945016] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-49449907-6b61-4d0a-b7ca-e34a3db450b9 tempest-AttachVolumeTestJSON-1669873106 tempest-AttachVolumeTestJSON-1669873106-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 761.945241] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-49449907-6b61-4d0a-b7ca-e34a3db450b9 tempest-AttachVolumeTestJSON-1669873106 tempest-AttachVolumeTestJSON-1669873106-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 761.945405] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-49449907-6b61-4d0a-b7ca-e34a3db450b9 tempest-AttachVolumeTestJSON-1669873106 tempest-AttachVolumeTestJSON-1669873106-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 761.992268] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-49449907-6b61-4d0a-b7ca-e34a3db450b9 tempest-AttachVolumeTestJSON-1669873106 tempest-AttachVolumeTestJSON-1669873106-project-member] [instance: ff848ebf-f82e-41d7-9f22-666d977326b5] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 761.992980] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-49449907-6b61-4d0a-b7ca-e34a3db450b9 tempest-AttachVolumeTestJSON-1669873106 tempest-AttachVolumeTestJSON-1669873106-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 761.993190] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-49449907-6b61-4d0a-b7ca-e34a3db450b9 tempest-AttachVolumeTestJSON-1669873106 tempest-AttachVolumeTestJSON-1669873106-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 761.993356] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-49449907-6b61-4d0a-b7ca-e34a3db450b9 tempest-AttachVolumeTestJSON-1669873106 tempest-AttachVolumeTestJSON-1669873106-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 761.998222] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-49449907-6b61-4d0a-b7ca-e34a3db450b9 tempest-AttachVolumeTestJSON-1669873106 tempest-AttachVolumeTestJSON-1669873106-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 761.998222] nova-conductor[52331]: Traceback (most recent call last):
[ 761.998222] nova-conductor[52331]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 761.998222] nova-conductor[52331]:     return func(*args, **kwargs)
[ 761.998222] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 761.998222] nova-conductor[52331]:     selections = self._select_destinations(
[ 761.998222] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 761.998222] nova-conductor[52331]:     selections = self._schedule(
[ 761.998222] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 761.998222] nova-conductor[52331]:     self._ensure_sufficient_hosts(
[ 761.998222] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 761.998222] nova-conductor[52331]:     raise exception.NoValidHost(reason=reason)
[ 761.998222] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 761.998222] nova-conductor[52331]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 761.998222] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-49449907-6b61-4d0a-b7ca-e34a3db450b9 tempest-AttachVolumeTestJSON-1669873106 tempest-AttachVolumeTestJSON-1669873106-project-member] [instance: ff848ebf-f82e-41d7-9f22-666d977326b5] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 762.556611] nova-conductor[52332]: ERROR nova.conductor.manager [None req-ca76f376-fba0-41c3-867e-5e0d3b47ac35 tempest-AttachInterfacesTestJSON-1729323615 tempest-AttachInterfacesTestJSON-1729323615-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 762.556611] nova-conductor[52332]: Traceback (most recent call last):
[ 762.556611] nova-conductor[52332]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 762.556611] nova-conductor[52332]:     return func(*args, **kwargs)
[ 762.556611] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 762.556611] nova-conductor[52332]:     selections = self._select_destinations(
[ 762.556611] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 762.556611] nova-conductor[52332]:     selections = self._schedule(
[ 762.556611] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 762.556611] nova-conductor[52332]:     self._ensure_sufficient_hosts(
[ 762.556611] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 762.556611] nova-conductor[52332]:     raise exception.NoValidHost(reason=reason)
[ 762.556611] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 762.556611] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 762.556611] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 762.556611] nova-conductor[52332]: ERROR nova.conductor.manager     host_lists = self._schedule_instances(context, request_specs[0],
[ 762.556611] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 762.556611] nova-conductor[52332]: ERROR nova.conductor.manager     host_lists = self.query_client.select_destinations(
[ 762.556611] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 762.556611] nova-conductor[52332]: ERROR nova.conductor.manager     return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 762.556611] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 762.556611] nova-conductor[52332]: ERROR nova.conductor.manager     return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 762.556611] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 762.556611] nova-conductor[52332]: ERROR nova.conductor.manager     result = self.transport._send(
[ 762.556611] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 762.556611] nova-conductor[52332]: ERROR nova.conductor.manager     return self._driver.send(target, ctxt, message,
[ 762.556611] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 762.556611] nova-conductor[52332]: ERROR nova.conductor.manager     return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 762.556611] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 762.556611] nova-conductor[52332]: ERROR nova.conductor.manager     raise result
[ 762.556611] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 762.556611] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 762.556611] nova-conductor[52332]: ERROR nova.conductor.manager
[ 762.556611] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 762.556611] nova-conductor[52332]: ERROR nova.conductor.manager     return func(*args, **kwargs)
[ 762.556611] nova-conductor[52332]: ERROR nova.conductor.manager
[ 762.556611] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 762.556611] nova-conductor[52332]: ERROR nova.conductor.manager     selections = self._select_destinations(
[ 762.556611] nova-conductor[52332]: ERROR nova.conductor.manager
[ 762.556611] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 762.556611] nova-conductor[52332]: ERROR nova.conductor.manager     selections = self._schedule(
[ 762.556611] nova-conductor[52332]: ERROR nova.conductor.manager
[ 762.556611] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 762.556611] nova-conductor[52332]: ERROR nova.conductor.manager     self._ensure_sufficient_hosts(
[ 762.556611] nova-conductor[52332]: ERROR nova.conductor.manager
[ 762.556611] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 762.556611] nova-conductor[52332]: ERROR nova.conductor.manager     raise exception.NoValidHost(reason=reason)
[ 762.556611] nova-conductor[52332]: ERROR nova.conductor.manager
[ 762.556611] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 762.556611] nova-conductor[52332]: ERROR nova.conductor.manager
[ 762.556611] nova-conductor[52332]: ERROR nova.conductor.manager
[ 762.565135] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-ca76f376-fba0-41c3-867e-5e0d3b47ac35 tempest-AttachInterfacesTestJSON-1729323615 tempest-AttachInterfacesTestJSON-1729323615-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 762.565379] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-ca76f376-fba0-41c3-867e-5e0d3b47ac35 tempest-AttachInterfacesTestJSON-1729323615 tempest-AttachInterfacesTestJSON-1729323615-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 762.565607] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-ca76f376-fba0-41c3-867e-5e0d3b47ac35 tempest-AttachInterfacesTestJSON-1729323615 tempest-AttachInterfacesTestJSON-1729323615-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 762.624594] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-ca76f376-fba0-41c3-867e-5e0d3b47ac35 tempest-AttachInterfacesTestJSON-1729323615 tempest-AttachInterfacesTestJSON-1729323615-project-member] [instance: 0dd37cac-91e8-4024-8458-448a307fa32f] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 762.625493] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-ca76f376-fba0-41c3-867e-5e0d3b47ac35 tempest-AttachInterfacesTestJSON-1729323615 tempest-AttachInterfacesTestJSON-1729323615-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 762.625493] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-ca76f376-fba0-41c3-867e-5e0d3b47ac35 tempest-AttachInterfacesTestJSON-1729323615 tempest-AttachInterfacesTestJSON-1729323615-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 762.625676] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-ca76f376-fba0-41c3-867e-5e0d3b47ac35 tempest-AttachInterfacesTestJSON-1729323615 tempest-AttachInterfacesTestJSON-1729323615-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 762.628892] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-ca76f376-fba0-41c3-867e-5e0d3b47ac35 tempest-AttachInterfacesTestJSON-1729323615 tempest-AttachInterfacesTestJSON-1729323615-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 762.628892] nova-conductor[52332]: Traceback (most recent call last):
[ 762.628892] nova-conductor[52332]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 762.628892] nova-conductor[52332]:     return func(*args, **kwargs)
[ 762.628892] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 762.628892] nova-conductor[52332]:     selections = self._select_destinations(
[ 762.628892] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 762.628892] nova-conductor[52332]:     selections = self._schedule(
[ 762.628892] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 762.628892] nova-conductor[52332]:     self._ensure_sufficient_hosts(
[ 762.628892] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 762.628892] nova-conductor[52332]:     raise exception.NoValidHost(reason=reason)
[ 762.628892] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 762.628892] nova-conductor[52332]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 762.629452] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-ca76f376-fba0-41c3-867e-5e0d3b47ac35 tempest-AttachInterfacesTestJSON-1729323615 tempest-AttachInterfacesTestJSON-1729323615-project-member] [instance: 0dd37cac-91e8-4024-8458-448a307fa32f] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 764.094714] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_api:SELECT=68,nova_api:DELETE=2,nova_api:UPDATE=4 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 764.095413] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_api:SELECT=64,nova_api:UPDATE=5,nova_api:DELETE=7 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 764.173258] nova-conductor[52331]: ERROR nova.conductor.manager [None req-6f3b0368-646e-4c94-ac86-b54e5c04368e tempest-ServerActionsTestOtherB-1727580260 tempest-ServerActionsTestOtherB-1727580260-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 764.173258] nova-conductor[52331]: Traceback (most recent call last):
[ 764.173258] nova-conductor[52331]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 764.173258] nova-conductor[52331]:     return func(*args, **kwargs)
[ 764.173258] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 764.173258] nova-conductor[52331]:     selections = self._select_destinations(
[ 764.173258] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 764.173258] nova-conductor[52331]:     selections = self._schedule(
[ 764.173258] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 764.173258] nova-conductor[52331]:     self._ensure_sufficient_hosts(
[ 764.173258] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 764.173258] nova-conductor[52331]:     raise exception.NoValidHost(reason=reason)
[ 764.173258] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 764.173258] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last): [ 764.173258] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 764.173258] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 764.173258] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 764.173258] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 764.173258] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 764.173258] nova-conductor[52331]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 764.173258] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 764.173258] nova-conductor[52331]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 764.173258] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 764.173258] nova-conductor[52331]: ERROR nova.conductor.manager result = self.transport._send( [ 764.173258] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 764.173258] nova-conductor[52331]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 764.173258] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 764.173258] nova-conductor[52331]: ERROR nova.conductor.manager return 
self._send(target, ctxt, message, wait_for_reply, timeout, [ 764.173258] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 764.173258] nova-conductor[52331]: ERROR nova.conductor.manager raise result [ 764.173258] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 764.173258] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last): [ 764.173258] nova-conductor[52331]: ERROR nova.conductor.manager [ 764.173258] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 764.173258] nova-conductor[52331]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 764.173258] nova-conductor[52331]: ERROR nova.conductor.manager [ 764.173258] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 764.173258] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._select_destinations( [ 764.173258] nova-conductor[52331]: ERROR nova.conductor.manager [ 764.173258] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 764.173258] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._schedule( [ 764.173258] nova-conductor[52331]: ERROR nova.conductor.manager [ 764.173258] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 764.173258] nova-conductor[52331]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 764.173258] nova-conductor[52331]: ERROR nova.conductor.manager [ 764.173258] nova-conductor[52331]: ERROR nova.conductor.manager File 
"/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 764.173258] nova-conductor[52331]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 764.173258] nova-conductor[52331]: ERROR nova.conductor.manager [ 764.173258] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 764.173258] nova-conductor[52331]: ERROR nova.conductor.manager [ 764.173258] nova-conductor[52331]: ERROR nova.conductor.manager [ 764.176157] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6f3b0368-646e-4c94-ac86-b54e5c04368e tempest-ServerActionsTestOtherB-1727580260 tempest-ServerActionsTestOtherB-1727580260-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 764.176592] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6f3b0368-646e-4c94-ac86-b54e5c04368e tempest-ServerActionsTestOtherB-1727580260 tempest-ServerActionsTestOtherB-1727580260-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 764.176955] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6f3b0368-646e-4c94-ac86-b54e5c04368e tempest-ServerActionsTestOtherB-1727580260 tempest-ServerActionsTestOtherB-1727580260-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 764.223958] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB 
stats nova_cell0:SELECT=41,nova_cell0:INSERT=46,nova_cell0:UPDATE=9,nova_cell0:SAVEPOINT=2,nova_cell0:RELEASE=2 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 764.228660] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-6f3b0368-646e-4c94-ac86-b54e5c04368e tempest-ServerActionsTestOtherB-1727580260 tempest-ServerActionsTestOtherB-1727580260-project-member] [instance: 89eec0b1-cc0f-44d1-9cf5-02f6fbc4d28f] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 764.229627] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6f3b0368-646e-4c94-ac86-b54e5c04368e tempest-ServerActionsTestOtherB-1727580260 tempest-ServerActionsTestOtherB-1727580260-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 764.230015] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6f3b0368-646e-4c94-ac86-b54e5c04368e tempest-ServerActionsTestOtherB-1727580260 tempest-ServerActionsTestOtherB-1727580260-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 764.230537] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-6f3b0368-646e-4c94-ac86-b54e5c04368e tempest-ServerActionsTestOtherB-1727580260 tempest-ServerActionsTestOtherB-1727580260-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 764.235910] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-6f3b0368-646e-4c94-ac86-b54e5c04368e tempest-ServerActionsTestOtherB-1727580260 tempest-ServerActionsTestOtherB-1727580260-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 764.235910] nova-conductor[52331]: Traceback (most recent call last): [ 764.235910] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 764.235910] nova-conductor[52331]: return func(*args, **kwargs) [ 764.235910] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 764.235910] nova-conductor[52331]: selections = self._select_destinations( [ 764.235910] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 764.235910] nova-conductor[52331]: selections = self._schedule( [ 764.235910] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 764.235910] nova-conductor[52331]: self._ensure_sufficient_hosts( [ 764.235910] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 764.235910] nova-conductor[52331]: raise exception.NoValidHost(reason=reason) [ 764.235910] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. 
There are not enough hosts available. [ 764.235910] nova-conductor[52331]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 764.237240] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-6f3b0368-646e-4c94-ac86-b54e5c04368e tempest-ServerActionsTestOtherB-1727580260 tempest-ServerActionsTestOtherB-1727580260-project-member] [instance: 89eec0b1-cc0f-44d1-9cf5-02f6fbc4d28f] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 764.745424] nova-conductor[52332]: ERROR nova.conductor.manager [None req-e66f0efe-106e-4a20-bbad-d2b17d3b9bd2 tempest-AttachVolumeTestJSON-1669873106 tempest-AttachVolumeTestJSON-1669873106-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 764.745424] nova-conductor[52332]: Traceback (most recent call last): [ 764.745424] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 764.745424] nova-conductor[52332]: return func(*args, **kwargs) [ 764.745424] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 764.745424] nova-conductor[52332]: selections = self._select_destinations( [ 764.745424] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 764.745424] nova-conductor[52332]: selections = self._schedule( [ 764.745424] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 764.745424] nova-conductor[52332]: self._ensure_sufficient_hosts( [ 764.745424] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 764.745424] nova-conductor[52332]: raise exception.NoValidHost(reason=reason) [ 764.745424] 
nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 764.745424] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last): [ 764.745424] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 764.745424] nova-conductor[52332]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 764.745424] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 764.745424] nova-conductor[52332]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 764.745424] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 764.745424] nova-conductor[52332]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 764.745424] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 764.745424] nova-conductor[52332]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 764.745424] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 764.745424] nova-conductor[52332]: ERROR nova.conductor.manager result = self.transport._send( [ 764.745424] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 764.745424] nova-conductor[52332]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 764.745424] nova-conductor[52332]: ERROR nova.conductor.manager File 
"/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 764.745424] nova-conductor[52332]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 764.745424] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 764.745424] nova-conductor[52332]: ERROR nova.conductor.manager raise result [ 764.745424] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 764.745424] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last): [ 764.745424] nova-conductor[52332]: ERROR nova.conductor.manager [ 764.745424] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 764.745424] nova-conductor[52332]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 764.745424] nova-conductor[52332]: ERROR nova.conductor.manager [ 764.745424] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 764.745424] nova-conductor[52332]: ERROR nova.conductor.manager selections = self._select_destinations( [ 764.745424] nova-conductor[52332]: ERROR nova.conductor.manager [ 764.745424] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 764.745424] nova-conductor[52332]: ERROR nova.conductor.manager selections = self._schedule( [ 764.745424] nova-conductor[52332]: ERROR nova.conductor.manager [ 764.745424] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 764.745424] nova-conductor[52332]: ERROR nova.conductor.manager 
self._ensure_sufficient_hosts( [ 764.745424] nova-conductor[52332]: ERROR nova.conductor.manager [ 764.745424] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 764.745424] nova-conductor[52332]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 764.745424] nova-conductor[52332]: ERROR nova.conductor.manager [ 764.745424] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 764.745424] nova-conductor[52332]: ERROR nova.conductor.manager [ 764.745424] nova-conductor[52332]: ERROR nova.conductor.manager [ 764.752219] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-e66f0efe-106e-4a20-bbad-d2b17d3b9bd2 tempest-AttachVolumeTestJSON-1669873106 tempest-AttachVolumeTestJSON-1669873106-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 764.752437] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-e66f0efe-106e-4a20-bbad-d2b17d3b9bd2 tempest-AttachVolumeTestJSON-1669873106 tempest-AttachVolumeTestJSON-1669873106-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 764.752599] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-e66f0efe-106e-4a20-bbad-d2b17d3b9bd2 tempest-AttachVolumeTestJSON-1669873106 tempest-AttachVolumeTestJSON-1669873106-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 764.794825] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell0:INSERT=52,nova_cell0:SELECT=39,nova_cell0:UPDATE=9 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 764.803707] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-e66f0efe-106e-4a20-bbad-d2b17d3b9bd2 tempest-AttachVolumeTestJSON-1669873106 tempest-AttachVolumeTestJSON-1669873106-project-member] [instance: a471f371-0c65-4d70-bcc1-71da26f39006] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 764.803707] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-e66f0efe-106e-4a20-bbad-d2b17d3b9bd2 tempest-AttachVolumeTestJSON-1669873106 tempest-AttachVolumeTestJSON-1669873106-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 764.803707] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-e66f0efe-106e-4a20-bbad-d2b17d3b9bd2 tempest-AttachVolumeTestJSON-1669873106 tempest-AttachVolumeTestJSON-1669873106-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s 
{{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 764.803707] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-e66f0efe-106e-4a20-bbad-d2b17d3b9bd2 tempest-AttachVolumeTestJSON-1669873106 tempest-AttachVolumeTestJSON-1669873106-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 764.807838] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-e66f0efe-106e-4a20-bbad-d2b17d3b9bd2 tempest-AttachVolumeTestJSON-1669873106 tempest-AttachVolumeTestJSON-1669873106-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 764.807838] nova-conductor[52332]: Traceback (most recent call last): [ 764.807838] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 764.807838] nova-conductor[52332]: return func(*args, **kwargs) [ 764.807838] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 764.807838] nova-conductor[52332]: selections = self._select_destinations( [ 764.807838] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 764.807838] nova-conductor[52332]: selections = self._schedule( [ 764.807838] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 764.807838] nova-conductor[52332]: self._ensure_sufficient_hosts( [ 764.807838] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 764.807838] nova-conductor[52332]: raise exception.NoValidHost(reason=reason) [ 764.807838] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. 
There are not enough hosts available. [ 764.807838] nova-conductor[52332]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 764.808672] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-e66f0efe-106e-4a20-bbad-d2b17d3b9bd2 tempest-AttachVolumeTestJSON-1669873106 tempest-AttachVolumeTestJSON-1669873106-project-member] [instance: a471f371-0c65-4d70-bcc1-71da26f39006] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 769.834471] nova-conductor[52331]: ERROR nova.conductor.manager [None req-06fe9959-aa99-4439-aee8-1af84076104c tempest-AttachInterfacesUnderV243Test-1379856390 tempest-AttachInterfacesUnderV243Test-1379856390-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 769.834471] nova-conductor[52331]: Traceback (most recent call last): [ 769.834471] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 769.834471] nova-conductor[52331]: return func(*args, **kwargs) [ 769.834471] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 769.834471] nova-conductor[52331]: selections = self._select_destinations( [ 769.834471] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 769.834471] nova-conductor[52331]: selections = self._schedule( [ 769.834471] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 769.834471] nova-conductor[52331]: self._ensure_sufficient_hosts( [ 769.834471] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 769.834471] nova-conductor[52331]: raise exception.NoValidHost(reason=reason) [ 769.834471] 
nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 769.834471] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last): [ 769.834471] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 769.834471] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 769.834471] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 769.834471] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 769.834471] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 769.834471] nova-conductor[52331]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 769.834471] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 769.834471] nova-conductor[52331]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 769.834471] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 769.834471] nova-conductor[52331]: ERROR nova.conductor.manager result = self.transport._send( [ 769.834471] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 769.834471] nova-conductor[52331]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 769.834471] nova-conductor[52331]: ERROR nova.conductor.manager File 
"/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 769.834471] nova-conductor[52331]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout, [ 769.834471] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 769.834471] nova-conductor[52331]: ERROR nova.conductor.manager raise result [ 769.834471] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 769.834471] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last): [ 769.834471] nova-conductor[52331]: ERROR nova.conductor.manager [ 769.834471] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 769.834471] nova-conductor[52331]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 769.834471] nova-conductor[52331]: ERROR nova.conductor.manager [ 769.834471] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 769.834471] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._select_destinations( [ 769.834471] nova-conductor[52331]: ERROR nova.conductor.manager [ 769.834471] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 769.834471] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._schedule( [ 769.834471] nova-conductor[52331]: ERROR nova.conductor.manager [ 769.834471] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 769.834471] nova-conductor[52331]: ERROR nova.conductor.manager 
self._ensure_sufficient_hosts(
[ 769.834471] nova-conductor[52331]: ERROR nova.conductor.manager
[ 769.834471] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 769.834471] nova-conductor[52331]: ERROR nova.conductor.manager     raise exception.NoValidHost(reason=reason)
[ 769.834471] nova-conductor[52331]: ERROR nova.conductor.manager
[ 769.834471] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 769.834471] nova-conductor[52331]: ERROR nova.conductor.manager
[ 769.834471] nova-conductor[52331]: ERROR nova.conductor.manager
[ 769.841496] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-06fe9959-aa99-4439-aee8-1af84076104c tempest-AttachInterfacesUnderV243Test-1379856390 tempest-AttachInterfacesUnderV243Test-1379856390-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 769.841706] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-06fe9959-aa99-4439-aee8-1af84076104c tempest-AttachInterfacesUnderV243Test-1379856390 tempest-AttachInterfacesUnderV243Test-1379856390-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 769.841875] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-06fe9959-aa99-4439-aee8-1af84076104c tempest-AttachInterfacesUnderV243Test-1379856390 tempest-AttachInterfacesUnderV243Test-1379856390-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 769.886380] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-06fe9959-aa99-4439-aee8-1af84076104c tempest-AttachInterfacesUnderV243Test-1379856390 tempest-AttachInterfacesUnderV243Test-1379856390-project-member] [instance: 732d1d9d-0d8a-48c6-b7de-6bfb10befa61] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 769.887299] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-06fe9959-aa99-4439-aee8-1af84076104c tempest-AttachInterfacesUnderV243Test-1379856390 tempest-AttachInterfacesUnderV243Test-1379856390-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 769.887522] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-06fe9959-aa99-4439-aee8-1af84076104c tempest-AttachInterfacesUnderV243Test-1379856390 tempest-AttachInterfacesUnderV243Test-1379856390-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.001s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 769.887693] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-06fe9959-aa99-4439-aee8-1af84076104c tempest-AttachInterfacesUnderV243Test-1379856390 tempest-AttachInterfacesUnderV243Test-1379856390-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 769.895371] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-06fe9959-aa99-4439-aee8-1af84076104c tempest-AttachInterfacesUnderV243Test-1379856390 tempest-AttachInterfacesUnderV243Test-1379856390-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 769.895371] nova-conductor[52331]: Traceback (most recent call last):
[ 769.895371] nova-conductor[52331]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 769.895371] nova-conductor[52331]:     return func(*args, **kwargs)
[ 769.895371] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 769.895371] nova-conductor[52331]:     selections = self._select_destinations(
[ 769.895371] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 769.895371] nova-conductor[52331]:     selections = self._schedule(
[ 769.895371] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 769.895371] nova-conductor[52331]:     self._ensure_sufficient_hosts(
[ 769.895371] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 769.895371] nova-conductor[52331]:     raise exception.NoValidHost(reason=reason)
[ 769.895371] nova-conductor[52331]: nova.exception.NoValidHost: No valid host
was found. There are not enough hosts available.
[ 769.895371] nova-conductor[52331]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 769.896066] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-06fe9959-aa99-4439-aee8-1af84076104c tempest-AttachInterfacesUnderV243Test-1379856390 tempest-AttachInterfacesUnderV243Test-1379856390-project-member] [instance: 732d1d9d-0d8a-48c6-b7de-6bfb10befa61] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 771.870436] nova-conductor[52332]: ERROR nova.conductor.manager [None req-9b320091-41fd-4a03-b293-f7432ea04237 tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 771.870436] nova-conductor[52332]: Traceback (most recent call last):
[ 771.870436] nova-conductor[52332]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 771.870436] nova-conductor[52332]:     return func(*args, **kwargs)
[ 771.870436] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 771.870436] nova-conductor[52332]:     selections = self._select_destinations(
[ 771.870436] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 771.870436] nova-conductor[52332]:     selections = self._schedule(
[ 771.870436] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 771.870436] nova-conductor[52332]:     self._ensure_sufficient_hosts(
[ 771.870436] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 771.870436] nova-conductor[52332]:     raise exception.NoValidHost(reason=reason)
[ 771.870436] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 771.870436] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 771.870436] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 771.870436] nova-conductor[52332]: ERROR nova.conductor.manager     host_lists = self._schedule_instances(context, request_specs[0],
[ 771.870436] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 771.870436] nova-conductor[52332]: ERROR nova.conductor.manager     host_lists = self.query_client.select_destinations(
[ 771.870436] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 771.870436] nova-conductor[52332]: ERROR nova.conductor.manager     return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 771.870436] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 771.870436] nova-conductor[52332]: ERROR nova.conductor.manager     return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 771.870436] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 771.870436] nova-conductor[52332]: ERROR nova.conductor.manager     result = self.transport._send(
[ 771.870436] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 771.870436] nova-conductor[52332]: ERROR nova.conductor.manager     return self._driver.send(target, ctxt, message,
[ 771.870436] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 771.870436] nova-conductor[52332]: ERROR nova.conductor.manager     return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 771.870436] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 771.870436] nova-conductor[52332]: ERROR nova.conductor.manager     raise result
[ 771.870436] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 771.870436] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 771.870436] nova-conductor[52332]: ERROR nova.conductor.manager
[ 771.870436] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 771.870436] nova-conductor[52332]: ERROR nova.conductor.manager     return func(*args, **kwargs)
[ 771.870436] nova-conductor[52332]: ERROR nova.conductor.manager
[ 771.870436] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 771.870436] nova-conductor[52332]: ERROR nova.conductor.manager     selections = self._select_destinations(
[ 771.870436] nova-conductor[52332]: ERROR nova.conductor.manager
[ 771.870436] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 771.870436] nova-conductor[52332]: ERROR nova.conductor.manager     selections = self._schedule(
[ 771.870436] nova-conductor[52332]: ERROR nova.conductor.manager
[ 771.870436] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 771.870436] nova-conductor[52332]: ERROR nova.conductor.manager
self._ensure_sufficient_hosts(
[ 771.870436] nova-conductor[52332]: ERROR nova.conductor.manager
[ 771.870436] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 771.870436] nova-conductor[52332]: ERROR nova.conductor.manager     raise exception.NoValidHost(reason=reason)
[ 771.870436] nova-conductor[52332]: ERROR nova.conductor.manager
[ 771.870436] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 771.870436] nova-conductor[52332]: ERROR nova.conductor.manager
[ 771.870436] nova-conductor[52332]: ERROR nova.conductor.manager
[ 771.877436] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-9b320091-41fd-4a03-b293-f7432ea04237 tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 771.877626] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-9b320091-41fd-4a03-b293-f7432ea04237 tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 771.877788] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-9b320091-41fd-4a03-b293-f7432ea04237 tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 771.919258] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-9b320091-41fd-4a03-b293-f7432ea04237 tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] [instance: 8a4737ec-bb6e-40ff-83db-7566cc3dc71a] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 771.920107] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-9b320091-41fd-4a03-b293-f7432ea04237 tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 771.920439] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-9b320091-41fd-4a03-b293-f7432ea04237 tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 771.920716] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-9b320091-41fd-4a03-b293-f7432ea04237 tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 771.925903] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-9b320091-41fd-4a03-b293-f7432ea04237 tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 771.925903] nova-conductor[52332]: Traceback (most recent call last):
[ 771.925903] nova-conductor[52332]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 771.925903] nova-conductor[52332]:     return func(*args, **kwargs)
[ 771.925903] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 771.925903] nova-conductor[52332]:     selections = self._select_destinations(
[ 771.925903] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 771.925903] nova-conductor[52332]:     selections = self._schedule(
[ 771.925903] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 771.925903] nova-conductor[52332]:     self._ensure_sufficient_hosts(
[ 771.925903] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 771.925903] nova-conductor[52332]:     raise exception.NoValidHost(reason=reason)
[ 771.925903] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 771.925903] nova-conductor[52332]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 771.926938] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-9b320091-41fd-4a03-b293-f7432ea04237 tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] [instance: 8a4737ec-bb6e-40ff-83db-7566cc3dc71a] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 773.502384] nova-conductor[52331]: ERROR nova.conductor.manager [None req-8a256ef5-0c4a-4239-a961-7fee09278c43 tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 773.502384] nova-conductor[52331]: Traceback (most recent call last):
[ 773.502384] nova-conductor[52331]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 773.502384] nova-conductor[52331]:     return func(*args, **kwargs)
[ 773.502384] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 773.502384] nova-conductor[52331]:     selections = self._select_destinations(
[ 773.502384] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 773.502384] nova-conductor[52331]:     selections = self._schedule(
[ 773.502384] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 773.502384] nova-conductor[52331]:     self._ensure_sufficient_hosts(
[ 773.502384] nova-conductor[52331]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 773.502384] nova-conductor[52331]:     raise exception.NoValidHost(reason=reason)
[ 773.502384] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 773.502384] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 773.502384] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 773.502384] nova-conductor[52331]: ERROR nova.conductor.manager     host_lists = self._schedule_instances(context, request_specs[0],
[ 773.502384] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 773.502384] nova-conductor[52331]: ERROR nova.conductor.manager     host_lists = self.query_client.select_destinations(
[ 773.502384] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 773.502384] nova-conductor[52331]: ERROR nova.conductor.manager     return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 773.502384] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 773.502384] nova-conductor[52331]: ERROR nova.conductor.manager     return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 773.502384] nova-conductor[52331]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 773.502384] nova-conductor[52331]: ERROR nova.conductor.manager     result = self.transport._send(
[ 773.502384] nova-conductor[52331]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 773.502384] nova-conductor[52331]: ERROR nova.conductor.manager     return self._driver.send(target, ctxt, message,
[ 773.502384] nova-conductor[52331]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 773.502384] nova-conductor[52331]: ERROR nova.conductor.manager     return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 773.502384] nova-conductor[52331]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 773.502384] nova-conductor[52331]: ERROR nova.conductor.manager     raise result
[ 773.502384] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 773.502384] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 773.502384] nova-conductor[52331]: ERROR nova.conductor.manager
[ 773.502384] nova-conductor[52331]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 773.502384] nova-conductor[52331]: ERROR nova.conductor.manager     return func(*args, **kwargs)
[ 773.502384] nova-conductor[52331]: ERROR nova.conductor.manager
[ 773.502384] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 773.502384] nova-conductor[52331]: ERROR nova.conductor.manager     selections = self._select_destinations(
[ 773.502384] nova-conductor[52331]: ERROR nova.conductor.manager
[ 773.502384] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 773.502384] nova-conductor[52331]: ERROR nova.conductor.manager     selections = self._schedule(
[ 773.502384] nova-conductor[52331]: ERROR nova.conductor.manager
[ 773.502384] nova-conductor[52331]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 773.502384] nova-conductor[52331]: ERROR nova.conductor.manager     self._ensure_sufficient_hosts(
[ 773.502384] nova-conductor[52331]: ERROR nova.conductor.manager
[ 773.502384] nova-conductor[52331]: ERROR nova.conductor.manager   File
"/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 773.502384] nova-conductor[52331]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 773.502384] nova-conductor[52331]: ERROR nova.conductor.manager [ 773.502384] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 773.502384] nova-conductor[52331]: ERROR nova.conductor.manager [ 773.502384] nova-conductor[52331]: ERROR nova.conductor.manager [ 773.509224] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-8a256ef5-0c4a-4239-a961-7fee09278c43 tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 773.509693] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-8a256ef5-0c4a-4239-a961-7fee09278c43 tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 773.509693] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-8a256ef5-0c4a-4239-a961-7fee09278c43 tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 773.548176] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-8a256ef5-0c4a-4239-a961-7fee09278c43 
tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] [instance: d3eb0204-49aa-44c6-813b-477dde377fc2] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 773.548900] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-8a256ef5-0c4a-4239-a961-7fee09278c43 tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 773.549136] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-8a256ef5-0c4a-4239-a961-7fee09278c43 tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 773.549286] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-8a256ef5-0c4a-4239-a961-7fee09278c43 tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s 
{{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 773.552390] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-8a256ef5-0c4a-4239-a961-7fee09278c43 tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 773.552390] nova-conductor[52331]: Traceback (most recent call last): [ 773.552390] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 773.552390] nova-conductor[52331]: return func(*args, **kwargs) [ 773.552390] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 773.552390] nova-conductor[52331]: selections = self._select_destinations( [ 773.552390] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 773.552390] nova-conductor[52331]: selections = self._schedule( [ 773.552390] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 773.552390] nova-conductor[52331]: self._ensure_sufficient_hosts( [ 773.552390] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 773.552390] nova-conductor[52331]: raise exception.NoValidHost(reason=reason) [ 773.552390] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 773.552390] nova-conductor[52331]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 773.553243] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-8a256ef5-0c4a-4239-a961-7fee09278c43 tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] [instance: d3eb0204-49aa-44c6-813b-477dde377fc2] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 773.574670] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-8a256ef5-0c4a-4239-a961-7fee09278c43 tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 773.574803] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-8a256ef5-0c4a-4239-a961-7fee09278c43 tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 773.574990] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-8a256ef5-0c4a-4239-a961-7fee09278c43 tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 773.625235] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell0:UPDATE=11,nova_cell0:SELECT=42,nova_cell0:INSERT=45,nova_cell0:SAVEPOINT=1,nova_cell0:RELEASE=1 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 773.908815] nova-conductor[52332]: ERROR nova.conductor.manager [None req-c28658a9-c23d-4bb5-925a-975c815bb0ec tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 773.908815] nova-conductor[52332]: Traceback (most recent call last):
[ 773.908815] nova-conductor[52332]:   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 773.908815] nova-conductor[52332]:     return func(*args, **kwargs)
[ 773.908815] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 773.908815] nova-conductor[52332]:     selections = self._select_destinations(
[ 773.908815] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 773.908815] nova-conductor[52332]:     selections = self._schedule(
[ 773.908815] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 773.908815] nova-conductor[52332]:     self._ensure_sufficient_hosts(
[ 773.908815] nova-conductor[52332]:   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 773.908815] nova-conductor[52332]:     raise exception.NoValidHost(reason=reason)
[ 773.908815] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 773.908815] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 773.908815] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 773.908815] nova-conductor[52332]: ERROR nova.conductor.manager     host_lists = self._schedule_instances(context, request_specs[0],
[ 773.908815] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 773.908815] nova-conductor[52332]: ERROR nova.conductor.manager     host_lists = self.query_client.select_destinations(
[ 773.908815] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 773.908815] nova-conductor[52332]: ERROR nova.conductor.manager     return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 773.908815] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 773.908815] nova-conductor[52332]: ERROR nova.conductor.manager     return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 773.908815] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 773.908815] nova-conductor[52332]: ERROR nova.conductor.manager     result = self.transport._send(
[ 773.908815] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 773.908815] nova-conductor[52332]: ERROR nova.conductor.manager     return self._driver.send(target, ctxt, message,
[ 773.908815] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 773.908815] nova-conductor[52332]: ERROR nova.conductor.manager     return
self._send(target, ctxt, message, wait_for_reply, timeout,
[ 773.908815] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 773.908815] nova-conductor[52332]: ERROR nova.conductor.manager     raise result
[ 773.908815] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 773.908815] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 773.908815] nova-conductor[52332]: ERROR nova.conductor.manager
[ 773.908815] nova-conductor[52332]: ERROR nova.conductor.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 773.908815] nova-conductor[52332]: ERROR nova.conductor.manager     return func(*args, **kwargs)
[ 773.908815] nova-conductor[52332]: ERROR nova.conductor.manager
[ 773.908815] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 773.908815] nova-conductor[52332]: ERROR nova.conductor.manager     selections = self._select_destinations(
[ 773.908815] nova-conductor[52332]: ERROR nova.conductor.manager
[ 773.908815] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 773.908815] nova-conductor[52332]: ERROR nova.conductor.manager     selections = self._schedule(
[ 773.908815] nova-conductor[52332]: ERROR nova.conductor.manager
[ 773.908815] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 773.908815] nova-conductor[52332]: ERROR nova.conductor.manager     self._ensure_sufficient_hosts(
[ 773.908815] nova-conductor[52332]: ERROR nova.conductor.manager
[ 773.908815] nova-conductor[52332]: ERROR nova.conductor.manager   File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 773.908815] nova-conductor[52332]: ERROR nova.conductor.manager     raise exception.NoValidHost(reason=reason)
[ 773.908815] nova-conductor[52332]: ERROR nova.conductor.manager
[ 773.908815] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 773.908815] nova-conductor[52332]: ERROR nova.conductor.manager
[ 773.908815] nova-conductor[52332]: ERROR nova.conductor.manager
[ 773.915489] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-c28658a9-c23d-4bb5-925a-975c815bb0ec tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 773.915727] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-c28658a9-c23d-4bb5-925a-975c815bb0ec tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 773.915896] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-c28658a9-c23d-4bb5-925a-975c815bb0ec tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 773.955926] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-c28658a9-c23d-4bb5-925a-975c815bb0ec tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] [instance: d2919051-8763-425f-b719-817e123f7f99] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 773.957015] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-c28658a9-c23d-4bb5-925a-975c815bb0ec tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 773.957015] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-c28658a9-c23d-4bb5-925a-975c815bb0ec tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 773.957015] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-c28658a9-c23d-4bb5-925a-975c815bb0ec tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s
{{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 773.959610] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-c28658a9-c23d-4bb5-925a-975c815bb0ec tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 773.959610] nova-conductor[52332]: Traceback (most recent call last): [ 773.959610] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 773.959610] nova-conductor[52332]: return func(*args, **kwargs) [ 773.959610] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 773.959610] nova-conductor[52332]: selections = self._select_destinations( [ 773.959610] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 773.959610] nova-conductor[52332]: selections = self._schedule( [ 773.959610] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 773.959610] nova-conductor[52332]: self._ensure_sufficient_hosts( [ 773.959610] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 773.959610] nova-conductor[52332]: raise exception.NoValidHost(reason=reason) [ 773.959610] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 773.959610] nova-conductor[52332]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 773.960248] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-c28658a9-c23d-4bb5-925a-975c815bb0ec tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] [instance: d2919051-8763-425f-b719-817e123f7f99] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 773.969810] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell0:INSERT=45,nova_cell0:SELECT=42,nova_cell0:UPDATE=11,nova_cell0:SAVEPOINT=1,nova_cell0:RELEASE=1 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 774.105773] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell1:UPDATE=4,nova_cell1:SELECT=4 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 774.192735] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:SELECT=32,nova_cell1:UPDATE=8,nova_cell1:SAVEPOINT=3,nova_cell1:RELEASE=3,nova_cell1:INSERT=2 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 775.660454] nova-conductor[52331]: ERROR nova.conductor.manager [None req-8cf567d3-45d6-445e-86e7-c69dee2dba45 tempest-ServerMetadataNegativeTestJSON-1764683187 tempest-ServerMetadataNegativeTestJSON-1764683187-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 775.660454] nova-conductor[52331]: Traceback (most recent call last): [ 775.660454] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 775.660454] nova-conductor[52331]: return func(*args, **kwargs) [ 775.660454] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 775.660454] nova-conductor[52331]: selections = self._select_destinations( [ 775.660454] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 775.660454] nova-conductor[52331]: selections = self._schedule( [ 775.660454] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 775.660454] nova-conductor[52331]: self._ensure_sufficient_hosts( [ 775.660454] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 775.660454] nova-conductor[52331]: raise exception.NoValidHost(reason=reason) [ 775.660454] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 775.660454] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last): [ 775.660454] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 775.660454] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 775.660454] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 775.660454] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 775.660454] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 775.660454] nova-conductor[52331]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 775.660454] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 775.660454] nova-conductor[52331]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 775.660454] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 775.660454] nova-conductor[52331]: ERROR nova.conductor.manager result = self.transport._send( [ 775.660454] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 775.660454] nova-conductor[52331]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 775.660454] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 775.660454] nova-conductor[52331]: ERROR nova.conductor.manager return 
self._send(target, ctxt, message, wait_for_reply, timeout, [ 775.660454] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 775.660454] nova-conductor[52331]: ERROR nova.conductor.manager raise result [ 775.660454] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 775.660454] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last): [ 775.660454] nova-conductor[52331]: ERROR nova.conductor.manager [ 775.660454] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 775.660454] nova-conductor[52331]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 775.660454] nova-conductor[52331]: ERROR nova.conductor.manager [ 775.660454] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 775.660454] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._select_destinations( [ 775.660454] nova-conductor[52331]: ERROR nova.conductor.manager [ 775.660454] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 775.660454] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._schedule( [ 775.660454] nova-conductor[52331]: ERROR nova.conductor.manager [ 775.660454] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 775.660454] nova-conductor[52331]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 775.660454] nova-conductor[52331]: ERROR nova.conductor.manager [ 775.660454] nova-conductor[52331]: ERROR nova.conductor.manager File 
"/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 775.660454] nova-conductor[52331]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 775.660454] nova-conductor[52331]: ERROR nova.conductor.manager [ 775.660454] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 775.660454] nova-conductor[52331]: ERROR nova.conductor.manager [ 775.660454] nova-conductor[52331]: ERROR nova.conductor.manager [ 775.668944] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-8cf567d3-45d6-445e-86e7-c69dee2dba45 tempest-ServerMetadataNegativeTestJSON-1764683187 tempest-ServerMetadataNegativeTestJSON-1764683187-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 775.669177] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-8cf567d3-45d6-445e-86e7-c69dee2dba45 tempest-ServerMetadataNegativeTestJSON-1764683187 tempest-ServerMetadataNegativeTestJSON-1764683187-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 775.669327] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-8cf567d3-45d6-445e-86e7-c69dee2dba45 tempest-ServerMetadataNegativeTestJSON-1764683187 tempest-ServerMetadataNegativeTestJSON-1764683187-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 775.702451] nova-conductor[52332]: 
ERROR nova.conductor.manager [None req-e41a91e3-751d-455f-a7a5-fbb62bc9c7ee tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 775.702451] nova-conductor[52332]: Traceback (most recent call last): [ 775.702451] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 775.702451] nova-conductor[52332]: return func(*args, **kwargs) [ 775.702451] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 775.702451] nova-conductor[52332]: selections = self._select_destinations( [ 775.702451] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 775.702451] nova-conductor[52332]: selections = self._schedule( [ 775.702451] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 775.702451] nova-conductor[52332]: self._ensure_sufficient_hosts( [ 775.702451] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 775.702451] nova-conductor[52332]: raise exception.NoValidHost(reason=reason) [ 775.702451] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. 
[ 775.702451] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last): [ 775.702451] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances [ 775.702451] nova-conductor[52332]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0], [ 775.702451] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances [ 775.702451] nova-conductor[52332]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations( [ 775.702451] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations [ 775.702451] nova-conductor[52332]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj, [ 775.702451] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations [ 775.702451] nova-conductor[52332]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args) [ 775.702451] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 775.702451] nova-conductor[52332]: ERROR nova.conductor.manager result = self.transport._send( [ 775.702451] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 775.702451] nova-conductor[52332]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message, [ 775.702451] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 775.702451] nova-conductor[52332]: ERROR nova.conductor.manager return 
self._send(target, ctxt, message, wait_for_reply, timeout, [ 775.702451] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 775.702451] nova-conductor[52332]: ERROR nova.conductor.manager raise result [ 775.702451] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 775.702451] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last): [ 775.702451] nova-conductor[52332]: ERROR nova.conductor.manager [ 775.702451] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 775.702451] nova-conductor[52332]: ERROR nova.conductor.manager return func(*args, **kwargs) [ 775.702451] nova-conductor[52332]: ERROR nova.conductor.manager [ 775.702451] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 775.702451] nova-conductor[52332]: ERROR nova.conductor.manager selections = self._select_destinations( [ 775.702451] nova-conductor[52332]: ERROR nova.conductor.manager [ 775.702451] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 775.702451] nova-conductor[52332]: ERROR nova.conductor.manager selections = self._schedule( [ 775.702451] nova-conductor[52332]: ERROR nova.conductor.manager [ 775.702451] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 775.702451] nova-conductor[52332]: ERROR nova.conductor.manager self._ensure_sufficient_hosts( [ 775.702451] nova-conductor[52332]: ERROR nova.conductor.manager [ 775.702451] nova-conductor[52332]: ERROR nova.conductor.manager File 
"/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 775.702451] nova-conductor[52332]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason) [ 775.702451] nova-conductor[52332]: ERROR nova.conductor.manager [ 775.702451] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 775.702451] nova-conductor[52332]: ERROR nova.conductor.manager [ 775.702451] nova-conductor[52332]: ERROR nova.conductor.manager [ 775.708936] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-e41a91e3-751d-455f-a7a5-fbb62bc9c7ee tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 775.709147] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-e41a91e3-751d-455f-a7a5-fbb62bc9c7ee tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 775.709306] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-e41a91e3-751d-455f-a7a5-fbb62bc9c7ee tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 775.717588] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-8cf567d3-45d6-445e-86e7-c69dee2dba45 
tempest-ServerMetadataNegativeTestJSON-1764683187 tempest-ServerMetadataNegativeTestJSON-1764683187-project-member] [instance: 7bc3cbef-0dc9-485a-bf96-8aaa85ce6955] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 775.718512] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-8cf567d3-45d6-445e-86e7-c69dee2dba45 tempest-ServerMetadataNegativeTestJSON-1764683187 tempest-ServerMetadataNegativeTestJSON-1764683187-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 775.718880] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-8cf567d3-45d6-445e-86e7-c69dee2dba45 tempest-ServerMetadataNegativeTestJSON-1764683187 tempest-ServerMetadataNegativeTestJSON-1764683187-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 775.719233] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-8cf567d3-45d6-445e-86e7-c69dee2dba45 tempest-ServerMetadataNegativeTestJSON-1764683187 tempest-ServerMetadataNegativeTestJSON-1764683187-project-member] Lock 
"00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 775.724355] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-8cf567d3-45d6-445e-86e7-c69dee2dba45 tempest-ServerMetadataNegativeTestJSON-1764683187 tempest-ServerMetadataNegativeTestJSON-1764683187-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 775.724355] nova-conductor[52331]: Traceback (most recent call last): [ 775.724355] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 775.724355] nova-conductor[52331]: return func(*args, **kwargs) [ 775.724355] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 775.724355] nova-conductor[52331]: selections = self._select_destinations( [ 775.724355] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 775.724355] nova-conductor[52331]: selections = self._schedule( [ 775.724355] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 775.724355] nova-conductor[52331]: self._ensure_sufficient_hosts( [ 775.724355] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 775.724355] nova-conductor[52331]: raise exception.NoValidHost(reason=reason) [ 775.724355] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available. [ 775.724355] nova-conductor[52331]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. 
[ 775.725454] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-8cf567d3-45d6-445e-86e7-c69dee2dba45 tempest-ServerMetadataNegativeTestJSON-1764683187 tempest-ServerMetadataNegativeTestJSON-1764683187-project-member] [instance: 7bc3cbef-0dc9-485a-bf96-8aaa85ce6955] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available. [ 775.749004] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-e41a91e3-751d-455f-a7a5-fbb62bc9c7ee tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] [instance: 468319be-98cf-4b73-9e3d-b6b0ce5f2778] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}} [ 775.749707] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-e41a91e3-751d-455f-a7a5-fbb62bc9c7ee tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 775.749909] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-e41a91e3-751d-455f-a7a5-fbb62bc9c7ee tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by 
"nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 775.750075] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-e41a91e3-751d-455f-a7a5-fbb62bc9c7ee tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 775.752763] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-e41a91e3-751d-455f-a7a5-fbb62bc9c7ee tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available. [ 775.752763] nova-conductor[52332]: Traceback (most recent call last): [ 775.752763] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner [ 775.752763] nova-conductor[52332]: return func(*args, **kwargs) [ 775.752763] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations [ 775.752763] nova-conductor[52332]: selections = self._select_destinations( [ 775.752763] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations [ 775.752763] nova-conductor[52332]: selections = self._schedule( [ 775.752763] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule [ 775.752763] nova-conductor[52332]: self._ensure_sufficient_hosts( [ 775.752763] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts [ 775.752763] nova-conductor[52332]: raise exception.NoValidHost(reason=reason) [ 775.752763] 
nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 775.752763] nova-conductor[52332]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 775.753261] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-e41a91e3-751d-455f-a7a5-fbb62bc9c7ee tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] [instance: 468319be-98cf-4b73-9e3d-b6b0ce5f2778] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 777.513843] nova-conductor[52331]: ERROR nova.conductor.manager [None req-37c3fa0c-1933-4502-a86f-84e89e5c30fc tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 777.513843] nova-conductor[52331]: Traceback (most recent call last):
[ 777.513843] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 777.513843] nova-conductor[52331]: return func(*args, **kwargs)
[ 777.513843] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 777.513843] nova-conductor[52331]: selections = self._select_destinations(
[ 777.513843] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 777.513843] nova-conductor[52331]: selections = self._schedule(
[ 777.513843] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 777.513843] nova-conductor[52331]: self._ensure_sufficient_hosts(
[ 777.513843] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 777.513843] nova-conductor[52331]: raise exception.NoValidHost(reason=reason)
[ 777.513843] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 777.513843] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 777.513843] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 777.513843] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0],
[ 777.513843] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 777.513843] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations(
[ 777.513843] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 777.513843] nova-conductor[52331]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 777.513843] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 777.513843] nova-conductor[52331]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 777.513843] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 777.513843] nova-conductor[52331]: ERROR nova.conductor.manager result = self.transport._send(
[ 777.513843] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 777.513843] nova-conductor[52331]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message,
[ 777.513843] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 777.513843] nova-conductor[52331]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 777.513843] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 777.513843] nova-conductor[52331]: ERROR nova.conductor.manager raise result
[ 777.513843] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 777.513843] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 777.513843] nova-conductor[52331]: ERROR nova.conductor.manager
[ 777.513843] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 777.513843] nova-conductor[52331]: ERROR nova.conductor.manager return func(*args, **kwargs)
[ 777.513843] nova-conductor[52331]: ERROR nova.conductor.manager
[ 777.513843] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 777.513843] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._select_destinations(
[ 777.513843] nova-conductor[52331]: ERROR nova.conductor.manager
[ 777.513843] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 777.513843] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._schedule(
[ 777.513843] nova-conductor[52331]: ERROR nova.conductor.manager
[ 777.513843] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 777.513843] nova-conductor[52331]: ERROR nova.conductor.manager self._ensure_sufficient_hosts(
[ 777.513843] nova-conductor[52331]: ERROR nova.conductor.manager
[ 777.513843] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 777.513843] nova-conductor[52331]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason)
[ 777.513843] nova-conductor[52331]: ERROR nova.conductor.manager
[ 777.513843] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 777.513843] nova-conductor[52331]: ERROR nova.conductor.manager
[ 777.513843] nova-conductor[52331]: ERROR nova.conductor.manager
[ 777.520455] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-37c3fa0c-1933-4502-a86f-84e89e5c30fc tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 777.520671] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-37c3fa0c-1933-4502-a86f-84e89e5c30fc tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 777.520840] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-37c3fa0c-1933-4502-a86f-84e89e5c30fc tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 777.558881] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-37c3fa0c-1933-4502-a86f-84e89e5c30fc tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] [instance: cc727a02-f515-4aeb-abd2-80d4ec2595ea] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 777.559773] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-37c3fa0c-1933-4502-a86f-84e89e5c30fc tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 777.560108] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-37c3fa0c-1933-4502-a86f-84e89e5c30fc tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 777.560341] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-37c3fa0c-1933-4502-a86f-84e89e5c30fc tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 777.563598] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-37c3fa0c-1933-4502-a86f-84e89e5c30fc tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 777.563598] nova-conductor[52331]: Traceback (most recent call last):
[ 777.563598] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 777.563598] nova-conductor[52331]: return func(*args, **kwargs)
[ 777.563598] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 777.563598] nova-conductor[52331]: selections = self._select_destinations(
[ 777.563598] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 777.563598] nova-conductor[52331]: selections = self._schedule(
[ 777.563598] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 777.563598] nova-conductor[52331]: self._ensure_sufficient_hosts(
[ 777.563598] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 777.563598] nova-conductor[52331]: raise exception.NoValidHost(reason=reason)
[ 777.563598] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 777.563598] nova-conductor[52331]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 777.564497] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-37c3fa0c-1933-4502-a86f-84e89e5c30fc tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] [instance: cc727a02-f515-4aeb-abd2-80d4ec2595ea] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 777.596168] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell0:SELECT=35,nova_cell0:UPDATE=19,nova_cell0:DELETE=2,nova_cell0:SAVEPOINT=1,nova_cell0:INSERT=42,nova_cell0:RELEASE=1 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}
[ 779.078090] nova-conductor[52332]: ERROR nova.conductor.manager [None req-dafb7808-f496-4fbe-b7ac-5997c2772abe tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 779.078090] nova-conductor[52332]: Traceback (most recent call last):
[ 779.078090] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 779.078090] nova-conductor[52332]: return func(*args, **kwargs)
[ 779.078090] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 779.078090] nova-conductor[52332]: selections = self._select_destinations(
[ 779.078090] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 779.078090] nova-conductor[52332]: selections = self._schedule(
[ 779.078090] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 779.078090] nova-conductor[52332]: self._ensure_sufficient_hosts(
[ 779.078090] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 779.078090] nova-conductor[52332]: raise exception.NoValidHost(reason=reason)
[ 779.078090] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 779.078090] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 779.078090] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 779.078090] nova-conductor[52332]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0],
[ 779.078090] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 779.078090] nova-conductor[52332]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations(
[ 779.078090] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 779.078090] nova-conductor[52332]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 779.078090] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 779.078090] nova-conductor[52332]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 779.078090] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 779.078090] nova-conductor[52332]: ERROR nova.conductor.manager result = self.transport._send(
[ 779.078090] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 779.078090] nova-conductor[52332]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message,
[ 779.078090] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 779.078090] nova-conductor[52332]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 779.078090] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 779.078090] nova-conductor[52332]: ERROR nova.conductor.manager raise result
[ 779.078090] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 779.078090] nova-conductor[52332]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 779.078090] nova-conductor[52332]: ERROR nova.conductor.manager
[ 779.078090] nova-conductor[52332]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 779.078090] nova-conductor[52332]: ERROR nova.conductor.manager return func(*args, **kwargs)
[ 779.078090] nova-conductor[52332]: ERROR nova.conductor.manager
[ 779.078090] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 779.078090] nova-conductor[52332]: ERROR nova.conductor.manager selections = self._select_destinations(
[ 779.078090] nova-conductor[52332]: ERROR nova.conductor.manager
[ 779.078090] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 779.078090] nova-conductor[52332]: ERROR nova.conductor.manager selections = self._schedule(
[ 779.078090] nova-conductor[52332]: ERROR nova.conductor.manager
[ 779.078090] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 779.078090] nova-conductor[52332]: ERROR nova.conductor.manager self._ensure_sufficient_hosts(
[ 779.078090] nova-conductor[52332]: ERROR nova.conductor.manager
[ 779.078090] nova-conductor[52332]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 779.078090] nova-conductor[52332]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason)
[ 779.078090] nova-conductor[52332]: ERROR nova.conductor.manager
[ 779.078090] nova-conductor[52332]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 779.078090] nova-conductor[52332]: ERROR nova.conductor.manager
[ 779.078090] nova-conductor[52332]: ERROR nova.conductor.manager
[ 779.085338] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-dafb7808-f496-4fbe-b7ac-5997c2772abe tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 779.085539] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-dafb7808-f496-4fbe-b7ac-5997c2772abe tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 779.085699] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-dafb7808-f496-4fbe-b7ac-5997c2772abe tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 779.120183] nova-conductor[52332]: DEBUG nova.conductor.manager [None req-dafb7808-f496-4fbe-b7ac-5997c2772abe tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] [instance: d48f2ebd-4a1c-4fe6-866e-49b03c8aa867] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52332) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 779.120910] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-dafb7808-f496-4fbe-b7ac-5997c2772abe tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 779.121104] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-dafb7808-f496-4fbe-b7ac-5997c2772abe tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 779.121271] nova-conductor[52332]: DEBUG oslo_concurrency.lockutils [None req-dafb7808-f496-4fbe-b7ac-5997c2772abe tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52332) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 779.123889] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-dafb7808-f496-4fbe-b7ac-5997c2772abe tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 779.123889] nova-conductor[52332]: Traceback (most recent call last):
[ 779.123889] nova-conductor[52332]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 779.123889] nova-conductor[52332]: return func(*args, **kwargs)
[ 779.123889] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 779.123889] nova-conductor[52332]: selections = self._select_destinations(
[ 779.123889] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 779.123889] nova-conductor[52332]: selections = self._schedule(
[ 779.123889] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 779.123889] nova-conductor[52332]: self._ensure_sufficient_hosts(
[ 779.123889] nova-conductor[52332]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 779.123889] nova-conductor[52332]: raise exception.NoValidHost(reason=reason)
[ 779.123889] nova-conductor[52332]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 779.123889] nova-conductor[52332]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 779.124655] nova-conductor[52332]: WARNING nova.scheduler.utils [None req-dafb7808-f496-4fbe-b7ac-5997c2772abe tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] [instance: d48f2ebd-4a1c-4fe6-866e-49b03c8aa867] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 780.633253] nova-conductor[52331]: ERROR nova.conductor.manager [None req-c710647b-f2d0-4d26-b9cb-4cb178d41280 tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Failed to schedule instances: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 780.633253] nova-conductor[52331]: Traceback (most recent call last):
[ 780.633253] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 780.633253] nova-conductor[52331]: return func(*args, **kwargs)
[ 780.633253] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 780.633253] nova-conductor[52331]: selections = self._select_destinations(
[ 780.633253] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 780.633253] nova-conductor[52331]: selections = self._schedule(
[ 780.633253] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 780.633253] nova-conductor[52331]: self._ensure_sufficient_hosts(
[ 780.633253] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 780.633253] nova-conductor[52331]: raise exception.NoValidHost(reason=reason)
[ 780.633253] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 780.633253] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 780.633253] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 1654, in schedule_and_build_instances
[ 780.633253] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self._schedule_instances(context, request_specs[0],
[ 780.633253] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/conductor/manager.py", line 942, in _schedule_instances
[ 780.633253] nova-conductor[52331]: ERROR nova.conductor.manager host_lists = self.query_client.select_destinations(
[ 780.633253] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/client/query.py", line 41, in select_destinations
[ 780.633253] nova-conductor[52331]: ERROR nova.conductor.manager return self.scheduler_rpcapi.select_destinations(context, spec_obj,
[ 780.633253] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/rpcapi.py", line 160, in select_destinations
[ 780.633253] nova-conductor[52331]: ERROR nova.conductor.manager return cctxt.call(ctxt, 'select_destinations', **msg_args)
[ 780.633253] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 780.633253] nova-conductor[52331]: ERROR nova.conductor.manager result = self.transport._send(
[ 780.633253] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 780.633253] nova-conductor[52331]: ERROR nova.conductor.manager return self._driver.send(target, ctxt, message,
[ 780.633253] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 780.633253] nova-conductor[52331]: ERROR nova.conductor.manager return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 780.633253] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 780.633253] nova-conductor[52331]: ERROR nova.conductor.manager raise result
[ 780.633253] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 780.633253] nova-conductor[52331]: ERROR nova.conductor.manager Traceback (most recent call last):
[ 780.633253] nova-conductor[52331]: ERROR nova.conductor.manager
[ 780.633253] nova-conductor[52331]: ERROR nova.conductor.manager File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 780.633253] nova-conductor[52331]: ERROR nova.conductor.manager return func(*args, **kwargs)
[ 780.633253] nova-conductor[52331]: ERROR nova.conductor.manager
[ 780.633253] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 780.633253] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._select_destinations(
[ 780.633253] nova-conductor[52331]: ERROR nova.conductor.manager
[ 780.633253] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 780.633253] nova-conductor[52331]: ERROR nova.conductor.manager selections = self._schedule(
[ 780.633253] nova-conductor[52331]: ERROR nova.conductor.manager
[ 780.633253] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 780.633253] nova-conductor[52331]: ERROR nova.conductor.manager self._ensure_sufficient_hosts(
[ 780.633253] nova-conductor[52331]: ERROR nova.conductor.manager
[ 780.633253] nova-conductor[52331]: ERROR nova.conductor.manager File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 780.633253] nova-conductor[52331]: ERROR nova.conductor.manager raise exception.NoValidHost(reason=reason)
[ 780.633253] nova-conductor[52331]: ERROR nova.conductor.manager
[ 780.633253] nova-conductor[52331]: ERROR nova.conductor.manager nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 780.633253] nova-conductor[52331]: ERROR nova.conductor.manager
[ 780.633253] nova-conductor[52331]: ERROR nova.conductor.manager
[ 780.640371] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-c710647b-f2d0-4d26-b9cb-4cb178d41280 tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 780.640627] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-c710647b-f2d0-4d26-b9cb-4cb178d41280 tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 780.640764] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-c710647b-f2d0-4d26-b9cb-4cb178d41280 tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 780.676195] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-c710647b-f2d0-4d26-b9cb-4cb178d41280 tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] [instance: 5d92ea70-b448-43d3-b594-3a2566b247f5] block_device_mapping [BlockDeviceMapping(attachment_id=,boot_index=0,connection_info=None,created_at=,delete_on_termination=True,deleted=,deleted_at=,destination_type='local',device_name=None,device_type='disk',disk_bus=None,encrypted=False,encryption_format=None,encryption_options=None,encryption_secret_uuid=None,guest_format=None,id=,image_id='ec0e7146-a242-44d5-85e8-d8242ece1b40',instance=,instance_uuid=,no_device=False,snapshot_id=None,source_type='image',tag=None,updated_at=,uuid=,volume_id=None,volume_size=None,volume_type=None)] {{(pid=52331) _create_block_device_mapping /opt/stack/nova/nova/conductor/manager.py:1506}}
[ 780.676849] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-c710647b-f2d0-4d26-b9cb-4cb178d41280 tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 780.677054] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-c710647b-f2d0-4d26-b9cb-4cb178d41280 tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 780.677222] nova-conductor[52331]: DEBUG oslo_concurrency.lockutils [None req-c710647b-f2d0-4d26-b9cb-4cb178d41280 tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.000s {{(pid=52331) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 780.679903] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-c710647b-f2d0-4d26-b9cb-4cb178d41280 tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
[ 780.679903] nova-conductor[52331]: Traceback (most recent call last):
[ 780.679903] nova-conductor[52331]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 244, in inner
[ 780.679903] nova-conductor[52331]: return func(*args, **kwargs)
[ 780.679903] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 224, in select_destinations
[ 780.679903] nova-conductor[52331]: selections = self._select_destinations(
[ 780.679903] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 251, in _select_destinations
[ 780.679903] nova-conductor[52331]: selections = self._schedule(
[ 780.679903] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 452, in _schedule
[ 780.679903] nova-conductor[52331]: self._ensure_sufficient_hosts(
[ 780.679903] nova-conductor[52331]: File "/opt/stack/nova/nova/scheduler/manager.py", line 499, in _ensure_sufficient_hosts
[ 780.679903] nova-conductor[52331]: raise exception.NoValidHost(reason=reason)
[ 780.679903] nova-conductor[52331]: nova.exception.NoValidHost: No valid host was found. There are not enough hosts available.
[ 780.679903] nova-conductor[52331]: : nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 780.680610] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-c710647b-f2d0-4d26-b9cb-4cb178d41280 tempest-ServersTestJSON-1598991444 tempest-ServersTestJSON-1598991444-project-member] [instance: 5d92ea70-b448-43d3-b594-3a2566b247f5] Setting instance to ERROR state.: nova.exception_Remote.NoValidHost_Remote: No valid host was found. There are not enough hosts available.
[ 784.096723] nova-conductor[52331]: ERROR nova.scheduler.utils [None req-afd2d988-4ac3-452c-866e-81965d82d8fd tempest-ServerShowV254Test-1884030427 tempest-ServerShowV254Test-1884030427-project-member] [instance: 8e524142-3b24-45ac-90b0-8f659bfb15b1] Error from last host: cpu-1 (node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28): ['Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance\n self.driver.spawn(context, instance, image_meta,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 537, in spawn\n self._vmops.spawn(context, instance, image_meta, injected_files,\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn\n self._fetch_image_if_missing(context, vi)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing\n image_cache(vi, tmp_image_ds_loc)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image\n vm_util.copy_virtual_disk(\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk\n session._wait_for_task(vmdk_copy_task)\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task\n return self.wait_for_task(task_ref)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner\n self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task\n raise exceptions.translate_fault(task_info.error)\n', "oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance\n self._build_and_run_instance(context, instance, image,\n', ' File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance\n raise exception.RescheduledException(\n', "nova.exception.RescheduledException: Build of instance 8e524142-3b24-45ac-90b0-8f659bfb15b1 was re-scheduled: A specified parameter was not correct: fileType\nFaults: ['InvalidArgument']\n"]
[ 784.097691] nova-conductor[52331]: DEBUG nova.conductor.manager [None req-afd2d988-4ac3-452c-866e-81965d82d8fd tempest-ServerShowV254Test-1884030427 tempest-ServerShowV254Test-1884030427-project-member] Rescheduling: True {{(pid=52331) build_instances /opt/stack/nova/nova/conductor/manager.py:695}}
[ 784.098056] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-afd2d988-4ac3-452c-866e-81965d82d8fd tempest-ServerShowV254Test-1884030427 tempest-ServerShowV254Test-1884030427-project-member] Failed to compute_task_build_instances: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 8e524142-3b24-45ac-90b0-8f659bfb15b1.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 8e524142-3b24-45ac-90b0-8f659bfb15b1.
[ 784.098445] nova-conductor[52331]: WARNING nova.scheduler.utils [None req-afd2d988-4ac3-452c-866e-81965d82d8fd tempest-ServerShowV254Test-1884030427 tempest-ServerShowV254Test-1884030427-project-member] [instance: 8e524142-3b24-45ac-90b0-8f659bfb15b1] Setting instance to ERROR state.: nova.exception.MaxRetriesExceeded: Exceeded maximum number of retries. Exhausted all hosts available for retrying build failures for instance 8e524142-3b24-45ac-90b0-8f659bfb15b1. [ 789.107096] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_api:SELECT=52,nova_api:DELETE=2,nova_api:UPDATE=6 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 789.137908] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell0:SELECT=35,nova_cell0:INSERT=43,nova_cell0:UPDATE=8 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 789.142320] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_api:SELECT=37,nova_api:UPDATE=3,nova_api:DELETE=7 {{(pid=52332) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 790.663642] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_api:SELECT=28,nova_api:DELETE=3,nova_api:UPDATE=1 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 790.693712] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_cell0:SELECT=18,nova_cell0:INSERT=22,nova_cell0:UPDATE=4 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 790.698668] nova-conductor[52331]: DEBUG dbcounter [-] [52331] Writing DB stats nova_api:SELECT=30,nova_api:UPDATE=5,nova_api:DELETE=3 {{(pid=52331) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 794.200801] nova-conductor[52332]: DEBUG dbcounter [-] [52332] Writing DB stats nova_cell1:SELECT=25,nova_cell1:UPDATE=7,nova_cell1:SAVEPOINT=1,nova_cell1:RELEASE=1 {{(pid=52332) 
stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}}